Practice 2. Recurrent Neural Networks¶

  • Alejandro Dopico Castro (alejandro.dopico2@udc.es).
  • Ana Xiangning Pereira Ezquerro (ana.ezquerro@udc.es).

The following notebook contains execution examples of the recurrent neural architecture proposed for the Walmart dataset. The Python scripts submitted include auxiliar code to simplify the readibility of the code cells.

  • data.py: Includes the WalmartDataset class to instantiate each dataset.
  • model.py: Includes the WalmartModel class to instantiate a model with fixed hyperparameters and the DenormalizedMAE metric to use in the fit() Keras method.
  • plots.py: Includes auxiliary functions to display the time series performance of model predictions.

Note: To properly visualize and interact with the Plotly graphs we recommend using the [walmart.html] file.

In [32]:
from data import * 
from plots import *
from model import *
from utils import *
from keras.layers import * 
from keras.models import Sequential, Model
from keras.optimizers import Adam, Optimizer, RMSprop
from keras.callbacks import EarlyStopping, ModelCheckpoint
from keras.regularizers import L1, L2, L1L2
from tensorflow.data import Dataset
from itertools import product 
from collections import OrderedDict
import plotly.offline as pyo
pyo.init_notebook_mode()

Regularizer.__str__ = lambda x: str(x.__class__.__name__)
Optimizer.__str__ = lambda x: str(x.__class__.__name__) + f'({float(x.learning_rate.numpy()):1.0e})'


# global parameters 
TEST_RATIO = 0.2
VAL_RATIO = 0.15
BATCH_SIZE = 200

# load data 
data = WalmartDataset.load('Walmart.csv')
train, val, test = data.split(VAL_RATIO, TEST_RATIO)

Recurrent Neural Model¶

To model the temporal relations in the stream data, our neural architecture is a recurrent encoder ($\mathcal{E}$) with $\ell$ hidden layers of dimension $d_h$ that project the input sequence $\mathbf{X}\in\mathbb{R}^{S\times d_x}$ to a time-contextualized sequence of embeddings $\mathbf{H} = \mathcal{E}(\mathbf{X}) \in \mathbb{R}^{S\times d_h}$ (where $d_x$ and $d_h$ denote respectively the number of input features and the hidden dimension of the model and $S$ denote the sequence length). The result $\mathbf{H}$ is passed through a final recurrent layer (LSTM-based) and the final state $\tilde{\mathbf{h}}\in\mathbb{R}^{d_h}$ is used as a summarization of the sample. This representation is then passed to a feed-forward decoder composed of $\varphi$ dense layers, where the last one is constrained with a linear activation to predict the output value $\hat{y}$ (number of sales expected for the timestep $t+2$).

In this section we explored three possible values for the hyperparamenter $S$ to validate the impact of the past observation in the sales modelling, maintaining the other hyperparemeters (number of layers, model dimension, activations, etc.) with default values. The default configuration (baseline) uses an encoder of 2-stacked LSTMs with a decoder of 2 feed-forward networks. The only regularization method used is dropout (10%). This naive network can be easily improved, but we decided to start with the simplest architecture and incrementally increase the complexity of the model while controlling the overfitting with regularization methods.

In [3]:
# S = 2
model2 = WalmartModel(2, hidden_size=10)
model2.train(train, val, 'results/walmart2.weights.h5', Adam(1e-3), batch_size=BATCH_SIZE)
model2.evaluate(test)
Epoch 1/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 4s 45ms/step - dmae: 542466.0625 - loss: 1.2749 - mae: 0.9731 - val_dmae: 482828.5000 - val_loss: 1.1356 - val_mae: 0.8661
Epoch 2/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 529696.8750 - loss: 1.2212 - mae: 0.9502 - val_dmae: 451427.3438 - val_loss: 0.9896 - val_mae: 0.8098
Epoch 3/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 476533.1250 - loss: 1.0042 - mae: 0.8548 - val_dmae: 320770.8438 - val_loss: 0.4981 - val_mae: 0.5754
Epoch 4/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 283703.5625 - loss: 0.3883 - mae: 0.5089 - val_dmae: 196077.3125 - val_loss: 0.2602 - val_mae: 0.3517
Epoch 5/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 133294.7500 - loss: 0.1291 - mae: 0.2391 - val_dmae: 177371.0000 - val_loss: 0.2245 - val_mae: 0.3182
Epoch 6/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 120702.5078 - loss: 0.1169 - mae: 0.2165 - val_dmae: 173767.5781 - val_loss: 0.2200 - val_mae: 0.3117
Epoch 7/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 117085.4062 - loss: 0.1121 - mae: 0.2100 - val_dmae: 171468.9688 - val_loss: 0.2138 - val_mae: 0.3076
Epoch 8/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 115684.9531 - loss: 0.1109 - mae: 0.2075 - val_dmae: 170553.6406 - val_loss: 0.2112 - val_mae: 0.3060
Epoch 9/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 113892.6484 - loss: 0.1075 - mae: 0.2043 - val_dmae: 169441.3125 - val_loss: 0.2082 - val_mae: 0.3040
Epoch 10/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 112650.3906 - loss: 0.1062 - mae: 0.2021 - val_dmae: 168496.5781 - val_loss: 0.2052 - val_mae: 0.3023
Epoch 11/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 113185.1719 - loss: 0.1087 - mae: 0.2030 - val_dmae: 167545.3125 - val_loss: 0.2021 - val_mae: 0.3006
Epoch 12/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 113233.1484 - loss: 0.1059 - mae: 0.2031 - val_dmae: 167308.2500 - val_loss: 0.1998 - val_mae: 0.3001
Epoch 13/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 112262.5703 - loss: 0.1044 - mae: 0.2014 - val_dmae: 167006.6406 - val_loss: 0.1984 - val_mae: 0.2996
Epoch 14/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 113321.8750 - loss: 0.1046 - mae: 0.2033 - val_dmae: 167198.8125 - val_loss: 0.1971 - val_mae: 0.2999
Epoch 15/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 2s 33ms/step - dmae: 113176.3359 - loss: 0.1042 - mae: 0.2030 - val_dmae: 166525.8594 - val_loss: 0.1947 - val_mae: 0.2987
Epoch 16/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 112381.8125 - loss: 0.1026 - mae: 0.2016 - val_dmae: 165626.2031 - val_loss: 0.1924 - val_mae: 0.2971
Epoch 17/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 112691.5000 - loss: 0.1029 - mae: 0.2022 - val_dmae: 165702.7500 - val_loss: 0.1905 - val_mae: 0.2973
Epoch 18/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 31ms/step - dmae: 113656.5312 - loss: 0.1049 - mae: 0.2039 - val_dmae: 164918.7500 - val_loss: 0.1880 - val_mae: 0.2958
Epoch 19/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 33ms/step - dmae: 113891.9297 - loss: 0.1029 - mae: 0.2043 - val_dmae: 164257.3125 - val_loss: 0.1861 - val_mae: 0.2947
Epoch 20/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 113350.8906 - loss: 0.1020 - mae: 0.2033 - val_dmae: 163191.5312 - val_loss: 0.1831 - val_mae: 0.2927
Epoch 21/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 33ms/step - dmae: 112985.3516 - loss: 0.1034 - mae: 0.2027 - val_dmae: 162745.6562 - val_loss: 0.1816 - val_mae: 0.2919
Epoch 22/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 33ms/step - dmae: 113113.8594 - loss: 0.1030 - mae: 0.2029 - val_dmae: 162130.7500 - val_loss: 0.1797 - val_mae: 0.2908
Epoch 23/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 112037.4297 - loss: 0.0988 - mae: 0.2010 - val_dmae: 160907.2188 - val_loss: 0.1766 - val_mae: 0.2887
Epoch 24/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 33ms/step - dmae: 112930.2344 - loss: 0.1011 - mae: 0.2026 - val_dmae: 160469.9688 - val_loss: 0.1751 - val_mae: 0.2879
Epoch 25/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 111846.2500 - loss: 0.0988 - mae: 0.2006 - val_dmae: 159643.8750 - val_loss: 0.1730 - val_mae: 0.2864
Epoch 26/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 112309.9297 - loss: 0.0995 - mae: 0.2015 - val_dmae: 158771.9375 - val_loss: 0.1702 - val_mae: 0.2848
Epoch 27/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 110020.6875 - loss: 0.0967 - mae: 0.1974 - val_dmae: 158217.5156 - val_loss: 0.1692 - val_mae: 0.2838
Epoch 28/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 111687.5703 - loss: 0.0975 - mae: 0.2004 - val_dmae: 157435.3281 - val_loss: 0.1672 - val_mae: 0.2824
Epoch 29/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 111681.6797 - loss: 0.0976 - mae: 0.2003 - val_dmae: 156246.0625 - val_loss: 0.1646 - val_mae: 0.2803
Epoch 30/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 112036.5859 - loss: 0.1020 - mae: 0.2010 - val_dmae: 155778.1562 - val_loss: 0.1635 - val_mae: 0.2794
Epoch 31/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 111349.8750 - loss: 0.0993 - mae: 0.1997 - val_dmae: 155304.1250 - val_loss: 0.1619 - val_mae: 0.2786
Epoch 32/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 109372.3359 - loss: 0.0954 - mae: 0.1962 - val_dmae: 154837.9844 - val_loss: 0.1608 - val_mae: 0.2778
Epoch 33/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 110124.2031 - loss: 0.0962 - mae: 0.1976 - val_dmae: 153661.8125 - val_loss: 0.1587 - val_mae: 0.2757
Epoch 34/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 111761.6094 - loss: 0.0966 - mae: 0.2005 - val_dmae: 153093.3750 - val_loss: 0.1575 - val_mae: 0.2746
Epoch 35/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 109890.2266 - loss: 0.0937 - mae: 0.1971 - val_dmae: 151940.3281 - val_loss: 0.1552 - val_mae: 0.2726
Epoch 36/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 110978.9141 - loss: 0.0949 - mae: 0.1991 - val_dmae: 152149.7344 - val_loss: 0.1549 - val_mae: 0.2729
Epoch 37/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 111047.1562 - loss: 0.0962 - mae: 0.1992 - val_dmae: 151276.1406 - val_loss: 0.1535 - val_mae: 0.2714
Epoch 38/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 110416.0312 - loss: 0.0932 - mae: 0.1981 - val_dmae: 150854.2344 - val_loss: 0.1522 - val_mae: 0.2706
Epoch 39/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 108360.7578 - loss: 0.0916 - mae: 0.1944 - val_dmae: 149624.4531 - val_loss: 0.1505 - val_mae: 0.2684
Epoch 40/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 110382.8125 - loss: 0.0931 - mae: 0.1980 - val_dmae: 148468.9688 - val_loss: 0.1488 - val_mae: 0.2663
Epoch 41/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 31ms/step - dmae: 108698.6172 - loss: 0.0894 - mae: 0.1950 - val_dmae: 148712.5781 - val_loss: 0.1482 - val_mae: 0.2668
Epoch 42/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 109725.9766 - loss: 0.0910 - mae: 0.1968 - val_dmae: 148072.0312 - val_loss: 0.1471 - val_mae: 0.2656
Epoch 43/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 108324.7344 - loss: 0.0889 - mae: 0.1943 - val_dmae: 147546.8750 - val_loss: 0.1466 - val_mae: 0.2647
Epoch 44/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 109750.2891 - loss: 0.0924 - mae: 0.1969 - val_dmae: 147162.1562 - val_loss: 0.1459 - val_mae: 0.2640
Epoch 45/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 108304.9375 - loss: 0.0898 - mae: 0.1943 - val_dmae: 146951.3281 - val_loss: 0.1453 - val_mae: 0.2636
Epoch 46/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 107908.9453 - loss: 0.0876 - mae: 0.1936 - val_dmae: 146345.2031 - val_loss: 0.1441 - val_mae: 0.2625
Epoch 47/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 33ms/step - dmae: 111194.6484 - loss: 0.0939 - mae: 0.1995 - val_dmae: 146156.3906 - val_loss: 0.1435 - val_mae: 0.2622
Epoch 48/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 33ms/step - dmae: 108984.6250 - loss: 0.0897 - mae: 0.1955 - val_dmae: 146511.0469 - val_loss: 0.1434 - val_mae: 0.2628
Epoch 49/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 107806.2812 - loss: 0.0873 - mae: 0.1934 - val_dmae: 145242.2188 - val_loss: 0.1423 - val_mae: 0.2605
Epoch 50/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 33ms/step - dmae: 106928.0391 - loss: 0.0884 - mae: 0.1918 - val_dmae: 145242.1406 - val_loss: 0.1418 - val_mae: 0.2605
Epoch 51/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 31ms/step - dmae: 109188.1562 - loss: 0.0915 - mae: 0.1959 - val_dmae: 145059.5781 - val_loss: 0.1416 - val_mae: 0.2602
Epoch 52/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 33ms/step - dmae: 108148.0391 - loss: 0.0880 - mae: 0.1940 - val_dmae: 144448.9844 - val_loss: 0.1408 - val_mae: 0.2591
Epoch 53/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 107990.7656 - loss: 0.0888 - mae: 0.1937 - val_dmae: 144357.6094 - val_loss: 0.1404 - val_mae: 0.2590
Epoch 54/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 2s 33ms/step - dmae: 107526.2656 - loss: 0.0877 - mae: 0.1929 - val_dmae: 143896.8906 - val_loss: 0.1398 - val_mae: 0.2581
Epoch 55/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 107085.8594 - loss: 0.0867 - mae: 0.1921 - val_dmae: 143877.1562 - val_loss: 0.1395 - val_mae: 0.2581
Epoch 56/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 2s 33ms/step - dmae: 107904.9688 - loss: 0.0883 - mae: 0.1936 - val_dmae: 143210.7031 - val_loss: 0.1387 - val_mae: 0.2569
Epoch 57/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 108230.6016 - loss: 0.0871 - mae: 0.1942 - val_dmae: 143528.7812 - val_loss: 0.1388 - val_mae: 0.2575
Epoch 58/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 108813.1562 - loss: 0.0887 - mae: 0.1952 - val_dmae: 143870.9375 - val_loss: 0.1391 - val_mae: 0.2581
Epoch 59/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 33ms/step - dmae: 108596.9219 - loss: 0.0884 - mae: 0.1948 - val_dmae: 142628.6406 - val_loss: 0.1383 - val_mae: 0.2559
Epoch 60/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 31ms/step - dmae: 107074.8047 - loss: 0.0867 - mae: 0.1921 - val_dmae: 143124.1719 - val_loss: 0.1380 - val_mae: 0.2567
Epoch 61/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 108068.4219 - loss: 0.0877 - mae: 0.1939 - val_dmae: 143261.3281 - val_loss: 0.1381 - val_mae: 0.2570
Epoch 62/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 3s 32ms/step - dmae: 107847.1641 - loss: 0.0868 - mae: 0.1935 - val_dmae: 142399.4844 - val_loss: 0.1370 - val_mae: 0.2554
Epoch 63/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 108915.8359 - loss: 0.0876 - mae: 0.1954 - val_dmae: 142658.0625 - val_loss: 0.1369 - val_mae: 0.2559
Epoch 64/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 106799.2969 - loss: 0.0860 - mae: 0.1916 - val_dmae: 142327.2188 - val_loss: 0.1363 - val_mae: 0.2553
Epoch 65/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 105874.8125 - loss: 0.0833 - mae: 0.1899 - val_dmae: 142008.0625 - val_loss: 0.1363 - val_mae: 0.2547
Epoch 66/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 108118.6641 - loss: 0.0855 - mae: 0.1940 - val_dmae: 141454.9531 - val_loss: 0.1356 - val_mae: 0.2538
Epoch 67/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 109463.1953 - loss: 0.0893 - mae: 0.1964 - val_dmae: 141685.8750 - val_loss: 0.1352 - val_mae: 0.2542
Epoch 68/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 105825.5234 - loss: 0.0858 - mae: 0.1898 - val_dmae: 140991.7812 - val_loss: 0.1351 - val_mae: 0.2529
Epoch 69/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 106448.1719 - loss: 0.0838 - mae: 0.1910 - val_dmae: 140771.2031 - val_loss: 0.1346 - val_mae: 0.2525
Epoch 70/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 104940.3516 - loss: 0.0825 - mae: 0.1883 - val_dmae: 141016.3750 - val_loss: 0.1346 - val_mae: 0.2530
Epoch 71/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 106284.2344 - loss: 0.0846 - mae: 0.1907 - val_dmae: 140602.1562 - val_loss: 0.1338 - val_mae: 0.2522
Epoch 72/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 106217.8594 - loss: 0.0820 - mae: 0.1905 - val_dmae: 140164.5625 - val_loss: 0.1336 - val_mae: 0.2514
Epoch 73/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 105000.2891 - loss: 0.0818 - mae: 0.1884 - val_dmae: 140387.6406 - val_loss: 0.1332 - val_mae: 0.2518
Epoch 74/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 106409.8359 - loss: 0.0834 - mae: 0.1909 - val_dmae: 139983.8594 - val_loss: 0.1331 - val_mae: 0.2511
Epoch 75/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 106516.8438 - loss: 0.0852 - mae: 0.1911 - val_dmae: 139730.4844 - val_loss: 0.1326 - val_mae: 0.2507
Epoch 76/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 105437.2344 - loss: 0.0820 - mae: 0.1891 - val_dmae: 139861.8281 - val_loss: 0.1328 - val_mae: 0.2509
Epoch 77/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 106327.5234 - loss: 0.0846 - mae: 0.1907 - val_dmae: 139872.5938 - val_loss: 0.1324 - val_mae: 0.2509
Epoch 78/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 106146.8438 - loss: 0.0838 - mae: 0.1904 - val_dmae: 139626.1875 - val_loss: 0.1321 - val_mae: 0.2505
Epoch 79/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 31ms/step - dmae: 107474.9922 - loss: 0.0847 - mae: 0.1928 - val_dmae: 139721.1719 - val_loss: 0.1324 - val_mae: 0.2506
Epoch 80/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 107348.9219 - loss: 0.0847 - mae: 0.1926 - val_dmae: 139577.4062 - val_loss: 0.1320 - val_mae: 0.2504
Epoch 81/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 106979.7734 - loss: 0.0847 - mae: 0.1919 - val_dmae: 138549.5312 - val_loss: 0.1311 - val_mae: 0.2485
Epoch 82/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 105385.9531 - loss: 0.0832 - mae: 0.1891 - val_dmae: 138810.9219 - val_loss: 0.1309 - val_mae: 0.2490
Epoch 83/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 105964.1875 - loss: 0.0833 - mae: 0.1901 - val_dmae: 139345.7812 - val_loss: 0.1312 - val_mae: 0.2500
Epoch 84/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 104889.5781 - loss: 0.0820 - mae: 0.1882 - val_dmae: 138710.0625 - val_loss: 0.1308 - val_mae: 0.2488
Epoch 85/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 33ms/step - dmae: 105695.7031 - loss: 0.0836 - mae: 0.1896 - val_dmae: 138363.7500 - val_loss: 0.1302 - val_mae: 0.2482
Epoch 86/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 31ms/step - dmae: 105218.9688 - loss: 0.0839 - mae: 0.1888 - val_dmae: 139187.0781 - val_loss: 0.1305 - val_mae: 0.2497
Epoch 87/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 105252.5781 - loss: 0.0844 - mae: 0.1888 - val_dmae: 138020.2031 - val_loss: 0.1299 - val_mae: 0.2476
Epoch 88/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 104995.2734 - loss: 0.0826 - mae: 0.1884 - val_dmae: 137290.6875 - val_loss: 0.1292 - val_mae: 0.2463
Epoch 89/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 105696.9922 - loss: 0.0826 - mae: 0.1896 - val_dmae: 138386.2031 - val_loss: 0.1299 - val_mae: 0.2483
Epoch 90/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 2s 33ms/step - dmae: 104927.4141 - loss: 0.0818 - mae: 0.1882 - val_dmae: 137169.7188 - val_loss: 0.1290 - val_mae: 0.2461
Epoch 91/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 33ms/step - dmae: 104667.6094 - loss: 0.0822 - mae: 0.1878 - val_dmae: 137768.0625 - val_loss: 0.1292 - val_mae: 0.2471
Epoch 92/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 33ms/step - dmae: 103445.3047 - loss: 0.0821 - mae: 0.1856 - val_dmae: 137818.1250 - val_loss: 0.1293 - val_mae: 0.2472
Epoch 93/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 33ms/step - dmae: 103454.4453 - loss: 0.0787 - mae: 0.1856 - val_dmae: 137255.6719 - val_loss: 0.1284 - val_mae: 0.2462
Epoch 94/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 104167.6328 - loss: 0.0809 - mae: 0.1869 - val_dmae: 136743.0469 - val_loss: 0.1280 - val_mae: 0.2453
Epoch 95/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 33ms/step - dmae: 103535.7812 - loss: 0.0788 - mae: 0.1857 - val_dmae: 136691.6406 - val_loss: 0.1279 - val_mae: 0.2452
Epoch 96/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 104826.0703 - loss: 0.0823 - mae: 0.1880 - val_dmae: 136810.1406 - val_loss: 0.1279 - val_mae: 0.2454
Epoch 97/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 106355.6016 - loss: 0.0829 - mae: 0.1908 - val_dmae: 136496.6406 - val_loss: 0.1273 - val_mae: 0.2449
Epoch 98/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 105626.4219 - loss: 0.0839 - mae: 0.1895 - val_dmae: 136440.7812 - val_loss: 0.1276 - val_mae: 0.2448
Epoch 99/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 104637.9688 - loss: 0.0823 - mae: 0.1877 - val_dmae: 136642.5625 - val_loss: 0.1270 - val_mae: 0.2451
Epoch 100/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 104655.5000 - loss: 0.0811 - mae: 0.1877 - val_dmae: 136200.2188 - val_loss: 0.1271 - val_mae: 0.2443
Epoch 101/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 104018.0781 - loss: 0.0807 - mae: 0.1866 - val_dmae: 136431.9844 - val_loss: 0.1267 - val_mae: 0.2447
Epoch 102/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 104705.2578 - loss: 0.0799 - mae: 0.1878 - val_dmae: 135655.1406 - val_loss: 0.1261 - val_mae: 0.2434
Epoch 103/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 104673.5547 - loss: 0.0800 - mae: 0.1878 - val_dmae: 136366.9375 - val_loss: 0.1266 - val_mae: 0.2446
Epoch 104/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 102701.0391 - loss: 0.0790 - mae: 0.1842 - val_dmae: 135848.4688 - val_loss: 0.1261 - val_mae: 0.2437
Epoch 105/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 105535.8281 - loss: 0.0829 - mae: 0.1893 - val_dmae: 136197.0156 - val_loss: 0.1263 - val_mae: 0.2443
Epoch 106/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 102814.2266 - loss: 0.0789 - mae: 0.1844 - val_dmae: 136579.7188 - val_loss: 0.1269 - val_mae: 0.2450
Epoch 107/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 104104.2266 - loss: 0.0808 - mae: 0.1868 - val_dmae: 134461.6562 - val_loss: 0.1247 - val_mae: 0.2412
Epoch 108/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 102041.6641 - loss: 0.0780 - mae: 0.1831 - val_dmae: 135754.6875 - val_loss: 0.1254 - val_mae: 0.2435
Epoch 109/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 103270.1797 - loss: 0.0788 - mae: 0.1853 - val_dmae: 135350.5625 - val_loss: 0.1254 - val_mae: 0.2428
Epoch 110/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 104910.7344 - loss: 0.0816 - mae: 0.1882 - val_dmae: 134735.2188 - val_loss: 0.1251 - val_mae: 0.2417
Epoch 111/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 103093.6016 - loss: 0.0793 - mae: 0.1849 - val_dmae: 135348.1875 - val_loss: 0.1248 - val_mae: 0.2428
Epoch 112/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 31ms/step - dmae: 104613.5078 - loss: 0.0824 - mae: 0.1877 - val_dmae: 136088.3594 - val_loss: 0.1257 - val_mae: 0.2441
Epoch 112: early stopping
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 16ms/step - dmae: 67069.3906 - loss: 0.0270 - mae: 0.1203
Out[3]:
[0.02606853097677231, 64549.47265625, 0.11579488962888718]
In [4]:
# S = 3
model3 = WalmartModel(3, hidden_size=10)
model3.train(train, val, 'results/walmart3.weights.h5', Adam(1e-3), batch_size=BATCH_SIZE)
model3.evaluate(test)
Epoch 1/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 4s 43ms/step - dmae: 540936.6250 - loss: 1.2712 - mae: 0.9704 - val_dmae: 446060.9375 - val_loss: 0.8851 - val_mae: 0.8002
Epoch 2/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 33ms/step - dmae: 526711.0000 - loss: 1.2101 - mae: 0.9449 - val_dmae: 417605.8438 - val_loss: 0.7719 - val_mae: 0.7491
Epoch 3/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 472037.7812 - loss: 0.9910 - mae: 0.8468 - val_dmae: 290335.1875 - val_loss: 0.3701 - val_mae: 0.5208
Epoch 4/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 257934.1094 - loss: 0.3433 - mae: 0.4627 - val_dmae: 186003.2031 - val_loss: 0.2247 - val_mae: 0.3337
Epoch 5/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 132825.7969 - loss: 0.1373 - mae: 0.2383 - val_dmae: 171562.5156 - val_loss: 0.1824 - val_mae: 0.3078
Epoch 6/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 122590.3438 - loss: 0.1296 - mae: 0.2199 - val_dmae: 168244.2969 - val_loss: 0.1790 - val_mae: 0.3018
Epoch 7/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 120699.1172 - loss: 0.1292 - mae: 0.2165 - val_dmae: 166717.6094 - val_loss: 0.1753 - val_mae: 0.2991
Epoch 8/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 119165.5938 - loss: 0.1258 - mae: 0.2138 - val_dmae: 165173.4531 - val_loss: 0.1717 - val_mae: 0.2963
Epoch 9/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 33ms/step - dmae: 119174.4141 - loss: 0.1264 - mae: 0.2138 - val_dmae: 163359.1562 - val_loss: 0.1675 - val_mae: 0.2930
Epoch 10/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 118432.2266 - loss: 0.1264 - mae: 0.2125 - val_dmae: 162832.9062 - val_loss: 0.1660 - val_mae: 0.2921
Epoch 11/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 31ms/step - dmae: 117244.1484 - loss: 0.1241 - mae: 0.2103 - val_dmae: 162737.7969 - val_loss: 0.1639 - val_mae: 0.2919
Epoch 12/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 120068.3359 - loss: 0.1260 - mae: 0.2154 - val_dmae: 162561.1719 - val_loss: 0.1632 - val_mae: 0.2916
Epoch 13/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 117621.6562 - loss: 0.1246 - mae: 0.2110 - val_dmae: 161742.9531 - val_loss: 0.1609 - val_mae: 0.2901
Epoch 14/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 118960.2812 - loss: 0.1226 - mae: 0.2134 - val_dmae: 161482.3906 - val_loss: 0.1603 - val_mae: 0.2897
Epoch 15/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 118165.0078 - loss: 0.1215 - mae: 0.2120 - val_dmae: 160821.7812 - val_loss: 0.1583 - val_mae: 0.2885
Epoch 16/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 117926.8594 - loss: 0.1220 - mae: 0.2115 - val_dmae: 160160.2031 - val_loss: 0.1579 - val_mae: 0.2873
Epoch 17/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 31ms/step - dmae: 116689.5000 - loss: 0.1217 - mae: 0.2093 - val_dmae: 160171.0156 - val_loss: 0.1567 - val_mae: 0.2873
Epoch 18/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 117721.7969 - loss: 0.1217 - mae: 0.2112 - val_dmae: 159014.6875 - val_loss: 0.1535 - val_mae: 0.2853
Epoch 19/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 117070.3750 - loss: 0.1192 - mae: 0.2100 - val_dmae: 159033.2656 - val_loss: 0.1544 - val_mae: 0.2853
Epoch 20/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 116089.2422 - loss: 0.1179 - mae: 0.2083 - val_dmae: 158147.5156 - val_loss: 0.1514 - val_mae: 0.2837
Epoch 21/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 116260.5703 - loss: 0.1175 - mae: 0.2086 - val_dmae: 156836.6406 - val_loss: 0.1490 - val_mae: 0.2813
Epoch 22/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 116733.7031 - loss: 0.1175 - mae: 0.2094 - val_dmae: 156020.1406 - val_loss: 0.1466 - val_mae: 0.2799
Epoch 23/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 115904.0000 - loss: 0.1175 - mae: 0.2079 - val_dmae: 155950.4688 - val_loss: 0.1470 - val_mae: 0.2798
Epoch 24/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 116191.7578 - loss: 0.1182 - mae: 0.2084 - val_dmae: 155160.9844 - val_loss: 0.1458 - val_mae: 0.2783
Epoch 25/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 115599.6250 - loss: 0.1145 - mae: 0.2074 - val_dmae: 154370.4219 - val_loss: 0.1448 - val_mae: 0.2769
Epoch 26/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 115037.5391 - loss: 0.1174 - mae: 0.2064 - val_dmae: 153504.0938 - val_loss: 0.1440 - val_mae: 0.2754
Epoch 27/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 112945.7500 - loss: 0.1118 - mae: 0.2026 - val_dmae: 152736.9688 - val_loss: 0.1438 - val_mae: 0.2740
Epoch 28/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 113147.0781 - loss: 0.1107 - mae: 0.2030 - val_dmae: 151582.6719 - val_loss: 0.1406 - val_mae: 0.2719
Epoch 29/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 113631.4766 - loss: 0.1121 - mae: 0.2038 - val_dmae: 150585.4688 - val_loss: 0.1406 - val_mae: 0.2701
Epoch 30/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 112733.0781 - loss: 0.1119 - mae: 0.2022 - val_dmae: 149879.8594 - val_loss: 0.1394 - val_mae: 0.2689
Epoch 31/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 112611.1797 - loss: 0.1105 - mae: 0.2020 - val_dmae: 148606.7500 - val_loss: 0.1372 - val_mae: 0.2666
Epoch 32/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 111950.7891 - loss: 0.1089 - mae: 0.2008 - val_dmae: 147362.0312 - val_loss: 0.1373 - val_mae: 0.2644
Epoch 33/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 111432.6562 - loss: 0.1095 - mae: 0.1999 - val_dmae: 146865.7812 - val_loss: 0.1354 - val_mae: 0.2635
Epoch 34/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 112428.4688 - loss: 0.1088 - mae: 0.2017 - val_dmae: 145913.6562 - val_loss: 0.1349 - val_mae: 0.2618
Epoch 35/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 111081.5469 - loss: 0.1066 - mae: 0.1993 - val_dmae: 145309.6562 - val_loss: 0.1339 - val_mae: 0.2607
Epoch 36/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 110126.4844 - loss: 0.1060 - mae: 0.1976 - val_dmae: 143942.8750 - val_loss: 0.1329 - val_mae: 0.2582
Epoch 37/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 111330.8828 - loss: 0.1068 - mae: 0.1997 - val_dmae: 143556.3750 - val_loss: 0.1332 - val_mae: 0.2575
Epoch 38/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 33ms/step - dmae: 110325.1953 - loss: 0.1053 - mae: 0.1979 - val_dmae: 141895.3438 - val_loss: 0.1299 - val_mae: 0.2545
Epoch 39/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 33ms/step - dmae: 111159.8906 - loss: 0.1054 - mae: 0.1994 - val_dmae: 140494.4688 - val_loss: 0.1277 - val_mae: 0.2520
Epoch 40/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 33ms/step - dmae: 109931.8906 - loss: 0.1042 - mae: 0.1972 - val_dmae: 139570.3281 - val_loss: 0.1268 - val_mae: 0.2504
Epoch 41/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 111023.9531 - loss: 0.1062 - mae: 0.1992 - val_dmae: 139635.0625 - val_loss: 0.1283 - val_mae: 0.2505
Epoch 42/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 108282.9219 - loss: 0.1023 - mae: 0.1942 - val_dmae: 138786.0156 - val_loss: 0.1265 - val_mae: 0.2490
Epoch 43/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 33ms/step - dmae: 108149.6328 - loss: 0.1018 - mae: 0.1940 - val_dmae: 137035.6250 - val_loss: 0.1244 - val_mae: 0.2458
Epoch 44/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 108169.9219 - loss: 0.1028 - mae: 0.1940 - val_dmae: 137295.5312 - val_loss: 0.1251 - val_mae: 0.2463
Epoch 45/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 107773.6016 - loss: 0.1010 - mae: 0.1933 - val_dmae: 135757.5000 - val_loss: 0.1234 - val_mae: 0.2435
Epoch 46/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 107828.5312 - loss: 0.1023 - mae: 0.1934 - val_dmae: 135303.0625 - val_loss: 0.1226 - val_mae: 0.2427
Epoch 47/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 31ms/step - dmae: 107727.1484 - loss: 0.1017 - mae: 0.1933 - val_dmae: 135700.1406 - val_loss: 0.1245 - val_mae: 0.2434
Epoch 48/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 108498.3203 - loss: 0.1000 - mae: 0.1946 - val_dmae: 134909.6875 - val_loss: 0.1239 - val_mae: 0.2420
Epoch 49/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 107736.9609 - loss: 0.0997 - mae: 0.1933 - val_dmae: 133203.0312 - val_loss: 0.1201 - val_mae: 0.2390
Epoch 50/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 108150.8750 - loss: 0.0993 - mae: 0.1940 - val_dmae: 133000.3594 - val_loss: 0.1191 - val_mae: 0.2386
Epoch 51/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 107397.0469 - loss: 0.0992 - mae: 0.1927 - val_dmae: 132878.0625 - val_loss: 0.1196 - val_mae: 0.2384
Epoch 52/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 109820.4297 - loss: 0.1019 - mae: 0.1970 - val_dmae: 131335.3594 - val_loss: 0.1181 - val_mae: 0.2356
Epoch 53/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 107559.5000 - loss: 0.0980 - mae: 0.1930 - val_dmae: 132667.8906 - val_loss: 0.1199 - val_mae: 0.2380
Epoch 54/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 106483.0859 - loss: 0.0995 - mae: 0.1910 - val_dmae: 131244.6406 - val_loss: 0.1177 - val_mae: 0.2354
Epoch 55/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 107027.5000 - loss: 0.0969 - mae: 0.1920 - val_dmae: 130604.6719 - val_loss: 0.1171 - val_mae: 0.2343
Epoch 56/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 106827.5938 - loss: 0.0976 - mae: 0.1916 - val_dmae: 131395.9531 - val_loss: 0.1184 - val_mae: 0.2357
Epoch 57/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 107072.7266 - loss: 0.0979 - mae: 0.1921 - val_dmae: 129069.6094 - val_loss: 0.1142 - val_mae: 0.2315
Epoch 58/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 31ms/step - dmae: 107479.7734 - loss: 0.0988 - mae: 0.1928 - val_dmae: 129324.2578 - val_loss: 0.1153 - val_mae: 0.2320
Epoch 59/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 104877.4062 - loss: 0.0967 - mae: 0.1881 - val_dmae: 130815.4141 - val_loss: 0.1172 - val_mae: 0.2347
Epoch 60/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 106237.4375 - loss: 0.0994 - mae: 0.1906 - val_dmae: 128380.0391 - val_loss: 0.1135 - val_mae: 0.2303
Epoch 61/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 105664.3203 - loss: 0.0957 - mae: 0.1896 - val_dmae: 128674.1875 - val_loss: 0.1139 - val_mae: 0.2308
Epoch 62/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 106641.8828 - loss: 0.0975 - mae: 0.1913 - val_dmae: 127305.0312 - val_loss: 0.1122 - val_mae: 0.2284
Epoch 63/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 107236.2109 - loss: 0.0992 - mae: 0.1924 - val_dmae: 127473.6016 - val_loss: 0.1116 - val_mae: 0.2287
Epoch 64/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 106514.9141 - loss: 0.0961 - mae: 0.1911 - val_dmae: 127776.3828 - val_loss: 0.1119 - val_mae: 0.2292
Epoch 65/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 106053.3594 - loss: 0.0958 - mae: 0.1902 - val_dmae: 126682.3516 - val_loss: 0.1109 - val_mae: 0.2273
Epoch 66/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 31ms/step - dmae: 105466.3984 - loss: 0.0949 - mae: 0.1892 - val_dmae: 126861.6484 - val_loss: 0.1102 - val_mae: 0.2276
Epoch 67/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 31ms/step - dmae: 107242.6875 - loss: 0.0959 - mae: 0.1924 - val_dmae: 127207.3672 - val_loss: 0.1109 - val_mae: 0.2282
Epoch 68/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 107251.6328 - loss: 0.0957 - mae: 0.1924 - val_dmae: 125882.2734 - val_loss: 0.1085 - val_mae: 0.2258
Epoch 69/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 107183.1797 - loss: 0.0940 - mae: 0.1923 - val_dmae: 126138.2188 - val_loss: 0.1083 - val_mae: 0.2263
Epoch 70/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 106752.0781 - loss: 0.0926 - mae: 0.1915 - val_dmae: 126654.4922 - val_loss: 0.1091 - val_mae: 0.2272
Epoch 71/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 105356.8359 - loss: 0.0945 - mae: 0.1890 - val_dmae: 126322.1719 - val_loss: 0.1086 - val_mae: 0.2266
Epoch 72/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 105719.0391 - loss: 0.0933 - mae: 0.1896 - val_dmae: 125724.5234 - val_loss: 0.1082 - val_mae: 0.2255
Epoch 73/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 104421.3203 - loss: 0.0906 - mae: 0.1873 - val_dmae: 124994.4219 - val_loss: 0.1073 - val_mae: 0.2242
Epoch 74/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 103898.9297 - loss: 0.0898 - mae: 0.1864 - val_dmae: 126199.6406 - val_loss: 0.1087 - val_mae: 0.2264
Epoch 75/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 105222.2500 - loss: 0.0911 - mae: 0.1888 - val_dmae: 125876.5703 - val_loss: 0.1080 - val_mae: 0.2258
Epoch 76/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 104338.0625 - loss: 0.0906 - mae: 0.1872 - val_dmae: 126101.1406 - val_loss: 0.1084 - val_mae: 0.2262
Epoch 77/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 2s 33ms/step - dmae: 102422.9766 - loss: 0.0876 - mae: 0.1837 - val_dmae: 124594.4141 - val_loss: 0.1065 - val_mae: 0.2235
Epoch 78/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 103584.4844 - loss: 0.0885 - mae: 0.1858 - val_dmae: 124104.4922 - val_loss: 0.1053 - val_mae: 0.2226
Epoch 79/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 104298.5938 - loss: 0.0881 - mae: 0.1871 - val_dmae: 124902.7422 - val_loss: 0.1058 - val_mae: 0.2241
Epoch 80/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 33ms/step - dmae: 103117.6797 - loss: 0.0886 - mae: 0.1850 - val_dmae: 124353.8672 - val_loss: 0.1056 - val_mae: 0.2231
Epoch 81/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 104999.2266 - loss: 0.0891 - mae: 0.1884 - val_dmae: 124839.6094 - val_loss: 0.1061 - val_mae: 0.2239
Epoch 82/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 33ms/step - dmae: 102493.0156 - loss: 0.0851 - mae: 0.1839 - val_dmae: 125998.2734 - val_loss: 0.1073 - val_mae: 0.2260
Epoch 83/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 103112.1172 - loss: 0.0856 - mae: 0.1850 - val_dmae: 124823.8359 - val_loss: 0.1056 - val_mae: 0.2239
Epoch 83: early stopping
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 16ms/step - dmae: 72043.2344 - loss: 0.0310 - mae: 0.1292
Out[4]:
[0.02758491039276123, 66173.328125, 0.11870791763067245]
In [5]:
# S = 4
model4 = WalmartModel(4, hidden_size=10)
model4.train(train, val, 'results/walmart3.weights.h5', Adam(1e-3), batch_size=BATCH_SIZE)
model4.evaluate(test)
Epoch 1/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 4s 44ms/step - dmae: 541183.5000 - loss: 1.2739 - mae: 0.9708 - val_dmae: 445947.1250 - val_loss: 0.8849 - val_mae: 0.8000
Epoch 2/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 523680.4062 - loss: 1.1994 - mae: 0.9394 - val_dmae: 389234.1562 - val_loss: 0.6675 - val_mae: 0.6982
Epoch 3/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 410993.5625 - loss: 0.7863 - mae: 0.7373 - val_dmae: 204479.6875 - val_loss: 0.2104 - val_mae: 0.3668
Epoch 4/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 33ms/step - dmae: 140061.5156 - loss: 0.1481 - mae: 0.2513 - val_dmae: 179813.2812 - val_loss: 0.1822 - val_mae: 0.3226
Epoch 5/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 123100.4844 - loss: 0.1321 - mae: 0.2208 - val_dmae: 177829.4844 - val_loss: 0.1793 - val_mae: 0.3190
Epoch 6/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 33ms/step - dmae: 121718.7812 - loss: 0.1309 - mae: 0.2184 - val_dmae: 176610.0781 - val_loss: 0.1757 - val_mae: 0.3168
Epoch 7/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 122855.4531 - loss: 0.1315 - mae: 0.2204 - val_dmae: 175641.4844 - val_loss: 0.1735 - val_mae: 0.3151
Epoch 8/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 121785.8672 - loss: 0.1301 - mae: 0.2185 - val_dmae: 175341.3594 - val_loss: 0.1713 - val_mae: 0.3145
Epoch 9/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 120890.7188 - loss: 0.1289 - mae: 0.2169 - val_dmae: 174643.0781 - val_loss: 0.1695 - val_mae: 0.3133
Epoch 10/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 121292.8984 - loss: 0.1299 - mae: 0.2176 - val_dmae: 174020.7812 - val_loss: 0.1675 - val_mae: 0.3122
Epoch 11/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 33ms/step - dmae: 120828.5781 - loss: 0.1299 - mae: 0.2168 - val_dmae: 173717.3281 - val_loss: 0.1668 - val_mae: 0.3116
Epoch 12/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 120860.0391 - loss: 0.1286 - mae: 0.2168 - val_dmae: 173037.5781 - val_loss: 0.1651 - val_mae: 0.3104
Epoch 13/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 118227.3750 - loss: 0.1263 - mae: 0.2121 - val_dmae: 172579.0781 - val_loss: 0.1642 - val_mae: 0.3096
Epoch 14/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 120309.1562 - loss: 0.1275 - mae: 0.2158 - val_dmae: 172116.3594 - val_loss: 0.1632 - val_mae: 0.3088
Epoch 15/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 33ms/step - dmae: 119245.2422 - loss: 0.1274 - mae: 0.2139 - val_dmae: 171584.2969 - val_loss: 0.1626 - val_mae: 0.3078
Epoch 16/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 119667.2344 - loss: 0.1261 - mae: 0.2147 - val_dmae: 170852.2812 - val_loss: 0.1609 - val_mae: 0.3065
Epoch 17/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 118587.4531 - loss: 0.1247 - mae: 0.2127 - val_dmae: 170126.2031 - val_loss: 0.1599 - val_mae: 0.3052
Epoch 18/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 33ms/step - dmae: 119761.6016 - loss: 0.1256 - mae: 0.2148 - val_dmae: 169628.2500 - val_loss: 0.1588 - val_mae: 0.3043
Epoch 19/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 118999.5859 - loss: 0.1254 - mae: 0.2135 - val_dmae: 169241.3125 - val_loss: 0.1578 - val_mae: 0.3036
Epoch 20/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 120278.4688 - loss: 0.1257 - mae: 0.2158 - val_dmae: 168276.1250 - val_loss: 0.1565 - val_mae: 0.3019
Epoch 21/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 119082.6094 - loss: 0.1239 - mae: 0.2136 - val_dmae: 167657.5312 - val_loss: 0.1552 - val_mae: 0.3008
Epoch 22/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 118829.9531 - loss: 0.1235 - mae: 0.2132 - val_dmae: 166965.6875 - val_loss: 0.1547 - val_mae: 0.2995
Epoch 23/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 117828.6094 - loss: 0.1223 - mae: 0.2114 - val_dmae: 166416.9531 - val_loss: 0.1534 - val_mae: 0.2985
Epoch 24/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 118497.5547 - loss: 0.1225 - mae: 0.2126 - val_dmae: 165379.1250 - val_loss: 0.1522 - val_mae: 0.2967
Epoch 25/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 117032.6328 - loss: 0.1208 - mae: 0.2099 - val_dmae: 164990.1406 - val_loss: 0.1516 - val_mae: 0.2960
Epoch 26/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 118040.7656 - loss: 0.1230 - mae: 0.2118 - val_dmae: 163983.4375 - val_loss: 0.1509 - val_mae: 0.2942
Epoch 27/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 115586.5312 - loss: 0.1188 - mae: 0.2073 - val_dmae: 163506.5000 - val_loss: 0.1496 - val_mae: 0.2933
Epoch 28/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 116925.1016 - loss: 0.1226 - mae: 0.2098 - val_dmae: 162691.6562 - val_loss: 0.1489 - val_mae: 0.2919
Epoch 29/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 117944.3047 - loss: 0.1194 - mae: 0.2116 - val_dmae: 161717.6562 - val_loss: 0.1479 - val_mae: 0.2901
Epoch 30/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 116896.6641 - loss: 0.1212 - mae: 0.2097 - val_dmae: 160816.7969 - val_loss: 0.1455 - val_mae: 0.2885
Epoch 31/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 116105.2422 - loss: 0.1179 - mae: 0.2083 - val_dmae: 160315.1094 - val_loss: 0.1462 - val_mae: 0.2876
Epoch 32/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 116258.7734 - loss: 0.1181 - mae: 0.2086 - val_dmae: 159055.1719 - val_loss: 0.1445 - val_mae: 0.2853
Epoch 33/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 115883.6797 - loss: 0.1173 - mae: 0.2079 - val_dmae: 158377.6094 - val_loss: 0.1431 - val_mae: 0.2841
Epoch 34/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 33ms/step - dmae: 115552.8828 - loss: 0.1167 - mae: 0.2073 - val_dmae: 157624.4062 - val_loss: 0.1428 - val_mae: 0.2828
Epoch 35/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 114920.1016 - loss: 0.1162 - mae: 0.2062 - val_dmae: 156869.3438 - val_loss: 0.1410 - val_mae: 0.2814
Epoch 36/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 33ms/step - dmae: 115326.1328 - loss: 0.1162 - mae: 0.2069 - val_dmae: 155840.2656 - val_loss: 0.1408 - val_mae: 0.2796
Epoch 37/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 114686.2031 - loss: 0.1160 - mae: 0.2057 - val_dmae: 154683.8438 - val_loss: 0.1384 - val_mae: 0.2775
Epoch 38/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 115514.4844 - loss: 0.1158 - mae: 0.2072 - val_dmae: 153769.5781 - val_loss: 0.1385 - val_mae: 0.2758
Epoch 39/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 114226.4062 - loss: 0.1153 - mae: 0.2049 - val_dmae: 153012.1719 - val_loss: 0.1362 - val_mae: 0.2745
Epoch 40/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 113979.2031 - loss: 0.1135 - mae: 0.2045 - val_dmae: 152202.2812 - val_loss: 0.1354 - val_mae: 0.2730
Epoch 41/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 113728.9062 - loss: 0.1115 - mae: 0.2040 - val_dmae: 151430.1094 - val_loss: 0.1353 - val_mae: 0.2716
Epoch 42/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 31ms/step - dmae: 113138.7812 - loss: 0.1108 - mae: 0.2030 - val_dmae: 150110.1406 - val_loss: 0.1350 - val_mae: 0.2693
Epoch 43/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 112732.0078 - loss: 0.1121 - mae: 0.2022 - val_dmae: 149602.0312 - val_loss: 0.1334 - val_mae: 0.2684
Epoch 44/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 112953.9062 - loss: 0.1103 - mae: 0.2026 - val_dmae: 148456.2188 - val_loss: 0.1319 - val_mae: 0.2663
Epoch 45/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 112995.1875 - loss: 0.1098 - mae: 0.2027 - val_dmae: 147468.7188 - val_loss: 0.1319 - val_mae: 0.2645
Epoch 46/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 112818.1875 - loss: 0.1119 - mae: 0.2024 - val_dmae: 146326.2031 - val_loss: 0.1303 - val_mae: 0.2625
Epoch 47/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 111875.6641 - loss: 0.1095 - mae: 0.2007 - val_dmae: 145698.6250 - val_loss: 0.1292 - val_mae: 0.2614
Epoch 48/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 110506.3906 - loss: 0.1053 - mae: 0.1982 - val_dmae: 144169.7656 - val_loss: 0.1285 - val_mae: 0.2586
Epoch 49/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 112231.7969 - loss: 0.1075 - mae: 0.2013 - val_dmae: 143225.8281 - val_loss: 0.1270 - val_mae: 0.2569
Epoch 50/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 110201.1484 - loss: 0.1069 - mae: 0.1977 - val_dmae: 142438.1719 - val_loss: 0.1259 - val_mae: 0.2555
Epoch 51/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 33ms/step - dmae: 110309.5938 - loss: 0.1065 - mae: 0.1979 - val_dmae: 141394.1406 - val_loss: 0.1253 - val_mae: 0.2536
Epoch 52/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 108985.8203 - loss: 0.1044 - mae: 0.1955 - val_dmae: 140468.4688 - val_loss: 0.1243 - val_mae: 0.2520
Epoch 53/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 111539.7656 - loss: 0.1061 - mae: 0.2001 - val_dmae: 139558.6094 - val_loss: 0.1235 - val_mae: 0.2504
Epoch 54/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 109181.4688 - loss: 0.1025 - mae: 0.1959 - val_dmae: 138296.8281 - val_loss: 0.1217 - val_mae: 0.2481
Epoch 55/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 109968.4609 - loss: 0.1032 - mae: 0.1973 - val_dmae: 137570.2344 - val_loss: 0.1217 - val_mae: 0.2468
Epoch 56/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 33ms/step - dmae: 107895.3750 - loss: 0.0994 - mae: 0.1936 - val_dmae: 136646.6875 - val_loss: 0.1208 - val_mae: 0.2451
Epoch 57/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 107892.7422 - loss: 0.1010 - mae: 0.1935 - val_dmae: 136022.4531 - val_loss: 0.1198 - val_mae: 0.2440
Epoch 58/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 108493.5859 - loss: 0.1007 - mae: 0.1946 - val_dmae: 134835.5938 - val_loss: 0.1189 - val_mae: 0.2419
Epoch 59/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 109186.6172 - loss: 0.1018 - mae: 0.1959 - val_dmae: 134169.0625 - val_loss: 0.1181 - val_mae: 0.2407
Epoch 60/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 33ms/step - dmae: 109062.6641 - loss: 0.1026 - mae: 0.1956 - val_dmae: 133646.3125 - val_loss: 0.1180 - val_mae: 0.2397
Epoch 61/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 107113.2734 - loss: 0.0979 - mae: 0.1921 - val_dmae: 132571.2188 - val_loss: 0.1162 - val_mae: 0.2378
Epoch 62/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 105384.2266 - loss: 0.0981 - mae: 0.1890 - val_dmae: 131826.2031 - val_loss: 0.1162 - val_mae: 0.2365
Epoch 63/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 106014.0703 - loss: 0.0974 - mae: 0.1902 - val_dmae: 131224.7031 - val_loss: 0.1149 - val_mae: 0.2354
Epoch 64/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 106483.6250 - loss: 0.0976 - mae: 0.1910 - val_dmae: 130980.0000 - val_loss: 0.1144 - val_mae: 0.2350
Epoch 65/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 33ms/step - dmae: 105500.1406 - loss: 0.0961 - mae: 0.1893 - val_dmae: 129380.7969 - val_loss: 0.1125 - val_mae: 0.2321
Epoch 66/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 107148.6094 - loss: 0.0964 - mae: 0.1922 - val_dmae: 129163.6406 - val_loss: 0.1125 - val_mae: 0.2317
Epoch 67/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 106408.5000 - loss: 0.0970 - mae: 0.1909 - val_dmae: 129165.8047 - val_loss: 0.1120 - val_mae: 0.2317
Epoch 68/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 104381.0312 - loss: 0.0939 - mae: 0.1872 - val_dmae: 127756.3828 - val_loss: 0.1108 - val_mae: 0.2292
Epoch 69/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 105224.0547 - loss: 0.0935 - mae: 0.1888 - val_dmae: 126828.8438 - val_loss: 0.1095 - val_mae: 0.2275
Epoch 70/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 104486.1484 - loss: 0.0930 - mae: 0.1874 - val_dmae: 126969.3359 - val_loss: 0.1094 - val_mae: 0.2278
Epoch 71/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 31ms/step - dmae: 103941.8906 - loss: 0.0921 - mae: 0.1865 - val_dmae: 126879.2422 - val_loss: 0.1091 - val_mae: 0.2276
Epoch 72/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 105350.7734 - loss: 0.0933 - mae: 0.1890 - val_dmae: 126084.0000 - val_loss: 0.1085 - val_mae: 0.2262
Epoch 73/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 104282.8750 - loss: 0.0908 - mae: 0.1871 - val_dmae: 126411.7500 - val_loss: 0.1079 - val_mae: 0.2268
Epoch 74/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 33ms/step - dmae: 104767.8828 - loss: 0.0938 - mae: 0.1879 - val_dmae: 124989.8672 - val_loss: 0.1067 - val_mae: 0.2242
Epoch 75/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 104791.2188 - loss: 0.0916 - mae: 0.1880 - val_dmae: 124792.5938 - val_loss: 0.1059 - val_mae: 0.2239
Epoch 76/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 104388.0078 - loss: 0.0911 - mae: 0.1873 - val_dmae: 125464.5000 - val_loss: 0.1064 - val_mae: 0.2251
Epoch 77/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 104210.6797 - loss: 0.0914 - mae: 0.1869 - val_dmae: 126268.2969 - val_loss: 0.1070 - val_mae: 0.2265
Epoch 78/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 103442.5859 - loss: 0.0917 - mae: 0.1856 - val_dmae: 125105.3750 - val_loss: 0.1058 - val_mae: 0.2244
Epoch 79/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 103514.8906 - loss: 0.0897 - mae: 0.1857 - val_dmae: 123832.4219 - val_loss: 0.1045 - val_mae: 0.2221
Epoch 80/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 103531.8594 - loss: 0.0920 - mae: 0.1857 - val_dmae: 123028.3438 - val_loss: 0.1036 - val_mae: 0.2207
Epoch 81/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 103358.8594 - loss: 0.0879 - mae: 0.1854 - val_dmae: 123414.1641 - val_loss: 0.1035 - val_mae: 0.2214
Epoch 82/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 103729.3047 - loss: 0.0902 - mae: 0.1861 - val_dmae: 122685.9375 - val_loss: 0.1031 - val_mae: 0.2201
Epoch 83/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 103041.3906 - loss: 0.0902 - mae: 0.1848 - val_dmae: 123704.9766 - val_loss: 0.1038 - val_mae: 0.2219
Epoch 84/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 102791.0625 - loss: 0.0872 - mae: 0.1844 - val_dmae: 125045.1719 - val_loss: 0.1050 - val_mae: 0.2243
Epoch 85/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 103251.6406 - loss: 0.0866 - mae: 0.1852 - val_dmae: 124456.6250 - val_loss: 0.1044 - val_mae: 0.2233
Epoch 86/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 33ms/step - dmae: 101863.6719 - loss: 0.0857 - mae: 0.1827 - val_dmae: 123422.0469 - val_loss: 0.1035 - val_mae: 0.2214
Epoch 87/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 104849.0078 - loss: 0.0879 - mae: 0.1881 - val_dmae: 124842.6250 - val_loss: 0.1044 - val_mae: 0.2240
Epoch 87: early stopping
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 16ms/step - dmae: 78321.1094 - loss: 0.0377 - mae: 0.1405
Out[5]:
[0.03053026646375656, 68176.4375, 0.12230127304792404]

The baseline network achieves 64k, 66k and 68k with a sequence length $S$ of 2, 3 and 4, respectively. Theoretically, having more information in the input of the model (e.g. the model with $S=4$ has more information than the first model with $S=2$) should retrieve at least the same results than simpler input representations. In practice, optimizing networks with a lot of input noise (data that has no information to predict the target) with no regularization techniques is extremely hard and would require a considerable amount of data to ensure the generalization of the model. In our case, from the results obtained with the baseline model, it seems that introducing more than $S=2$ past information does not help the network to predict the target outcome, so it should be enough to use sequences of length 2. For the next models we fixed this hyperparameter to $S=2$.

In [35]:
plot_series(model2, [train, val, test], title='Prediction with S=2').show()
plot_series(model3, [train, val, test], title='Prediction with S=3').show()
plot_series(model4, [train, val, test], title='Prediction with S=4').show()
2024-04-11 13:11:38.098054: W tensorflow/core/framework/local_rendezvous.cc:404] Local rendezvous is aborting with status: OUT_OF_RANGE: End of sequence
2024-04-11 13:11:43.823147: W tensorflow/core/framework/local_rendezvous.cc:404] Local rendezvous is aborting with status: OUT_OF_RANGE: End of sequence
2024-04-11 13:11:50.085394: W tensorflow/core/framework/local_rendezvous.cc:404] Local rendezvous is aborting with status: OUT_OF_RANGE: End of sequence
2024-04-11 13:11:56.094750: W tensorflow/core/framework/local_rendezvous.cc:404] Local rendezvous is aborting with status: OUT_OF_RANGE: End of sequence
2024-04-11 13:12:02.430470: W tensorflow/core/framework/local_rendezvous.cc:404] Local rendezvous is aborting with status: OUT_OF_RANGE: End of sequence
2024-04-11 13:12:08.287126: W tensorflow/core/framework/local_rendezvous.cc:404] Local rendezvous is aborting with status: OUT_OF_RANGE: End of sequence
2024-04-11 13:12:13.957533: W tensorflow/core/framework/local_rendezvous.cc:404] Local rendezvous is aborting with status: OUT_OF_RANGE: End of sequence
2024-04-11 13:12:20.489455: W tensorflow/core/framework/local_rendezvous.cc:404] Local rendezvous is aborting with status: OUT_OF_RANGE: End of sequence
2024-04-11 13:12:26.115864: W tensorflow/core/framework/local_rendezvous.cc:404] Local rendezvous is aborting with status: OUT_OF_RANGE: End of sequence

To better show the performance disparity between different models we printed the absolute error of the three models in a single plot. Note that the model with $S=2$ gets a higher error (specially in outlier observations) and the model with $S=4$ is the closest to the zero-line, indicating a lower absolute error.

In [36]:
plot_errors([model2, model3, model4], [train, val, test])
2024-04-11 13:12:32.828964: W tensorflow/core/framework/local_rendezvous.cc:404] Local rendezvous is aborting with status: OUT_OF_RANGE: End of sequence
2024-04-11 13:12:38.511505: W tensorflow/core/framework/local_rendezvous.cc:404] Local rendezvous is aborting with status: OUT_OF_RANGE: End of sequence
2024-04-11 13:12:45.043433: W tensorflow/core/framework/local_rendezvous.cc:404] Local rendezvous is aborting with status: OUT_OF_RANGE: End of sequence
2024-04-11 13:12:50.694314: W tensorflow/core/framework/local_rendezvous.cc:404] Local rendezvous is aborting with status: OUT_OF_RANGE: End of sequence
2024-04-11 13:12:57.079231: W tensorflow/core/framework/local_rendezvous.cc:404] Local rendezvous is aborting with status: OUT_OF_RANGE: End of sequence
2024-04-11 13:13:02.855903: W tensorflow/core/framework/local_rendezvous.cc:404] Local rendezvous is aborting with status: OUT_OF_RANGE: End of sequence
2024-04-11 13:13:09.154021: W tensorflow/core/framework/local_rendezvous.cc:404] Local rendezvous is aborting with status: OUT_OF_RANGE: End of sequence
2024-04-11 13:13:14.980685: W tensorflow/core/framework/local_rendezvous.cc:404] Local rendezvous is aborting with status: OUT_OF_RANGE: End of sequence
2024-04-11 13:13:20.412501: W tensorflow/core/framework/local_rendezvous.cc:404] Local rendezvous is aborting with status: OUT_OF_RANGE: End of sequence

Increasing the complexity of the model¶

Once we got our baseline results, we increased the complexity of the model by modifying the hyperparameters of the network (see model.py).

  • base_layer: The recurrent base cell of the encoder $\mathcal{E}$. There are available two options: the LSTM and the GRU. Althought the LSTM layer is more popular than GRU (specifically in NLP tasks), the GRU has its advantages over the LSTM (e.g. it has less parameters) and it is still used in other DL applications. By default, each base layer is a LSTM.
  • num_encoder_layers ($\ell$): Number of layers in the encoder $\mathcal{E}$.
  • num_decoder_layers ($\varphi$): Number of layer in the decoder.
  • hidden_size ($d_h$): Hidden dimension of the encoder $\mathcal{E}$.
  • regularizer: Kernel and bias regularizer in the hidden layers. By default there is no regularization.
  • initializer: Weight initialization. All biases are initialized from zero. By default, kernels are initialized following a random normal distribution.
  • bidirectional: Whether to process left-to-right and right-to-left the input sequence or only left-to-right. By default, the processing is bidirectional, so the left-to-right and righ-to-left information is concatenated to produce a unique contextualization of each timestep observation.
  • dropout: Dropout value in the latent space of the neural architecture.

The next cell code increaes the dimensionality of the model upon $d_h=50$ and uses 3 layers in the encoder (maintaining 2 layers in the decoder). We tested two different architectures: the first one uses the LSTM cell again and the second replaces the LSTM layer by the GRU cell.

In [38]:
model_lstm = WalmartModel(2, hidden_size=50, num_encoder_layers=3)
model_lstm.train(train, val, 'results/walmart3.weights.h5', Adam(1e-3), batch_size=BATCH_SIZE)
model_lstm.evaluate(test)
Epoch 1/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 7s 55ms/step - dmae: 541028.4375 - loss: 1.2628 - mae: 0.9705 - val_dmae: 425864.0312 - val_loss: 0.8837 - val_mae: 0.7640
Epoch 2/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 2s 35ms/step - dmae: 356509.9062 - loss: 0.6480 - mae: 0.6395 - val_dmae: 177930.2500 - val_loss: 0.2288 - val_mae: 0.3192
Epoch 3/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 2s 35ms/step - dmae: 116861.7734 - loss: 0.1174 - mae: 0.2096 - val_dmae: 167336.6250 - val_loss: 0.2143 - val_mae: 0.3002
Epoch 4/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 2s 34ms/step - dmae: 114137.8984 - loss: 0.1138 - mae: 0.2048 - val_dmae: 168187.6406 - val_loss: 0.2091 - val_mae: 0.3017
Epoch 5/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 2s 34ms/step - dmae: 114569.1953 - loss: 0.1126 - mae: 0.2055 - val_dmae: 168283.5312 - val_loss: 0.2067 - val_mae: 0.3019
Epoch 6/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 2s 34ms/step - dmae: 116331.8828 - loss: 0.1143 - mae: 0.2087 - val_dmae: 168518.9375 - val_loss: 0.2040 - val_mae: 0.3023
Epoch 7/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 2s 34ms/step - dmae: 118146.3594 - loss: 0.1152 - mae: 0.2119 - val_dmae: 167939.1719 - val_loss: 0.2013 - val_mae: 0.3013
Epoch 8/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 2s 34ms/step - dmae: 116887.8828 - loss: 0.1115 - mae: 0.2097 - val_dmae: 167847.4062 - val_loss: 0.1986 - val_mae: 0.3011
Epoch 8: early stopping
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 16ms/step - dmae: 78822.9453 - loss: 0.0351 - mae: 0.1414
Out[38]:
[0.03196907415986061, 73991.0546875, 0.1327320635318756]
In [39]:
model_gru = WalmartModel(2, base_layer=GRU, hidden_size=50, num_encoder_layers=3)
model_gru.train(train, val, 'results/walmart3.weights.h5', Adam(1e-3), batch_size=BATCH_SIZE)
model_gru.evaluate(test)
Epoch 1/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 7s 53ms/step - dmae: 525163.6875 - loss: 1.2022 - mae: 0.9421 - val_dmae: 189033.2812 - val_loss: 0.2542 - val_mae: 0.3391
Epoch 2/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 2s 34ms/step - dmae: 171360.1094 - loss: 0.1827 - mae: 0.3074 - val_dmae: 190236.4844 - val_loss: 0.2677 - val_mae: 0.3413
Epoch 3/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 2s 35ms/step - dmae: 109318.0156 - loss: 0.1113 - mae: 0.1961 - val_dmae: 168021.2969 - val_loss: 0.2158 - val_mae: 0.3014
Epoch 4/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 2s 35ms/step - dmae: 109797.6641 - loss: 0.1098 - mae: 0.1970 - val_dmae: 165004.7188 - val_loss: 0.2051 - val_mae: 0.2960
Epoch 5/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 2s 34ms/step - dmae: 109416.9141 - loss: 0.1070 - mae: 0.1963 - val_dmae: 164106.9375 - val_loss: 0.1972 - val_mae: 0.2944
Epoch 6/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 2s 35ms/step - dmae: 110591.1719 - loss: 0.1061 - mae: 0.1984 - val_dmae: 163802.1250 - val_loss: 0.1911 - val_mae: 0.2938
Epoch 7/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 2s 35ms/step - dmae: 112792.9453 - loss: 0.1058 - mae: 0.2023 - val_dmae: 163523.4375 - val_loss: 0.1854 - val_mae: 0.2933
Epoch 8/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 2s 35ms/step - dmae: 111071.8203 - loss: 0.1022 - mae: 0.1993 - val_dmae: 162903.8750 - val_loss: 0.1811 - val_mae: 0.2922
Epoch 9/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 2s 35ms/step - dmae: 112165.5625 - loss: 0.1027 - mae: 0.2012 - val_dmae: 162133.5625 - val_loss: 0.1781 - val_mae: 0.2909
Epoch 10/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 2s 35ms/step - dmae: 112104.1328 - loss: 0.1014 - mae: 0.2011 - val_dmae: 161738.9062 - val_loss: 0.1760 - val_mae: 0.2901
Epoch 11/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 2s 35ms/step - dmae: 112796.5078 - loss: 0.1004 - mae: 0.2023 - val_dmae: 159849.2812 - val_loss: 0.1715 - val_mae: 0.2868
Epoch 12/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 2s 34ms/step - dmae: 111312.7969 - loss: 0.0995 - mae: 0.1997 - val_dmae: 159882.1406 - val_loss: 0.1706 - val_mae: 0.2868
Epoch 13/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 2s 35ms/step - dmae: 111637.8594 - loss: 0.0989 - mae: 0.2003 - val_dmae: 159171.9844 - val_loss: 0.1684 - val_mae: 0.2855
Epoch 14/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 2s 35ms/step - dmae: 110569.5859 - loss: 0.0970 - mae: 0.1984 - val_dmae: 157746.8281 - val_loss: 0.1655 - val_mae: 0.2830
Epoch 15/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 2s 35ms/step - dmae: 109712.0547 - loss: 0.0950 - mae: 0.1968 - val_dmae: 157215.9844 - val_loss: 0.1639 - val_mae: 0.2820
Epoch 16/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 2s 35ms/step - dmae: 109980.0234 - loss: 0.0952 - mae: 0.1973 - val_dmae: 155647.9062 - val_loss: 0.1614 - val_mae: 0.2792
Epoch 17/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 2s 35ms/step - dmae: 108776.5781 - loss: 0.0916 - mae: 0.1951 - val_dmae: 153423.2344 - val_loss: 0.1576 - val_mae: 0.2752
Epoch 18/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 2s 35ms/step - dmae: 108486.1719 - loss: 0.0917 - mae: 0.1946 - val_dmae: 154014.7031 - val_loss: 0.1574 - val_mae: 0.2763
Epoch 19/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 2s 35ms/step - dmae: 107799.8594 - loss: 0.0901 - mae: 0.1934 - val_dmae: 152414.8906 - val_loss: 0.1540 - val_mae: 0.2734
Epoch 20/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 2s 35ms/step - dmae: 106329.3672 - loss: 0.0884 - mae: 0.1907 - val_dmae: 151545.2188 - val_loss: 0.1508 - val_mae: 0.2719
Epoch 21/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 2s 35ms/step - dmae: 105966.0391 - loss: 0.0869 - mae: 0.1901 - val_dmae: 149537.7188 - val_loss: 0.1474 - val_mae: 0.2683
Epoch 22/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 2s 35ms/step - dmae: 104388.3047 - loss: 0.0846 - mae: 0.1873 - val_dmae: 149337.6406 - val_loss: 0.1454 - val_mae: 0.2679
Epoch 23/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 2s 35ms/step - dmae: 103674.5234 - loss: 0.0836 - mae: 0.1860 - val_dmae: 147303.0625 - val_loss: 0.1416 - val_mae: 0.2642
Epoch 24/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 2s 35ms/step - dmae: 104319.8047 - loss: 0.0829 - mae: 0.1871 - val_dmae: 145766.2188 - val_loss: 0.1394 - val_mae: 0.2615
Epoch 25/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 2s 34ms/step - dmae: 104076.4453 - loss: 0.0818 - mae: 0.1867 - val_dmae: 146061.2500 - val_loss: 0.1388 - val_mae: 0.2620
Epoch 26/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 2s 35ms/step - dmae: 101932.6719 - loss: 0.0803 - mae: 0.1829 - val_dmae: 144234.1719 - val_loss: 0.1358 - val_mae: 0.2587
Epoch 27/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 2s 35ms/step - dmae: 102819.5078 - loss: 0.0808 - mae: 0.1844 - val_dmae: 143184.4844 - val_loss: 0.1350 - val_mae: 0.2569
Epoch 28/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 2s 35ms/step - dmae: 100399.2109 - loss: 0.0767 - mae: 0.1801 - val_dmae: 142360.8281 - val_loss: 0.1328 - val_mae: 0.2554
Epoch 29/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 2s 35ms/step - dmae: 103223.0391 - loss: 0.0802 - mae: 0.1852 - val_dmae: 139313.1094 - val_loss: 0.1286 - val_mae: 0.2499
Epoch 30/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 2s 35ms/step - dmae: 102348.4609 - loss: 0.0784 - mae: 0.1836 - val_dmae: 139132.4219 - val_loss: 0.1271 - val_mae: 0.2496
Epoch 31/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 2s 35ms/step - dmae: 102387.0234 - loss: 0.0779 - mae: 0.1837 - val_dmae: 138350.6875 - val_loss: 0.1258 - val_mae: 0.2482
Epoch 32/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 2s 34ms/step - dmae: 100219.6094 - loss: 0.0758 - mae: 0.1798 - val_dmae: 138669.4375 - val_loss: 0.1257 - val_mae: 0.2488
Epoch 33/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 2s 34ms/step - dmae: 100385.4297 - loss: 0.0747 - mae: 0.1801 - val_dmae: 138432.5469 - val_loss: 0.1242 - val_mae: 0.2483
Epoch 34/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 2s 35ms/step - dmae: 99647.8203 - loss: 0.0745 - mae: 0.1788 - val_dmae: 136882.0156 - val_loss: 0.1215 - val_mae: 0.2456
Epoch 35/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 2s 35ms/step - dmae: 99347.9141 - loss: 0.0735 - mae: 0.1782 - val_dmae: 135133.5781 - val_loss: 0.1195 - val_mae: 0.2424
Epoch 36/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 2s 34ms/step - dmae: 98495.4219 - loss: 0.0724 - mae: 0.1767 - val_dmae: 136209.8594 - val_loss: 0.1211 - val_mae: 0.2443
Epoch 37/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 2s 34ms/step - dmae: 97213.7422 - loss: 0.0705 - mae: 0.1744 - val_dmae: 134411.0781 - val_loss: 0.1173 - val_mae: 0.2411
Epoch 38/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 2s 48ms/step - dmae: 97048.8359 - loss: 0.0699 - mae: 0.1741 - val_dmae: 133364.7656 - val_loss: 0.1166 - val_mae: 0.2392
Epoch 39/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 2s 35ms/step - dmae: 97005.8438 - loss: 0.0709 - mae: 0.1740 - val_dmae: 132827.4688 - val_loss: 0.1148 - val_mae: 0.2383
Epoch 40/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 2s 34ms/step - dmae: 97565.8516 - loss: 0.0707 - mae: 0.1750 - val_dmae: 133914.7031 - val_loss: 0.1168 - val_mae: 0.2402
Epoch 41/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 2s 34ms/step - dmae: 97036.3672 - loss: 0.0704 - mae: 0.1741 - val_dmae: 133973.3125 - val_loss: 0.1152 - val_mae: 0.2403
Epoch 42/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 2s 34ms/step - dmae: 96758.5781 - loss: 0.0687 - mae: 0.1736 - val_dmae: 133312.0625 - val_loss: 0.1144 - val_mae: 0.2391
Epoch 43/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 2s 35ms/step - dmae: 98092.6172 - loss: 0.0705 - mae: 0.1760 - val_dmae: 130319.1562 - val_loss: 0.1114 - val_mae: 0.2338
Epoch 44/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 2s 34ms/step - dmae: 95912.1953 - loss: 0.0677 - mae: 0.1721 - val_dmae: 131547.6094 - val_loss: 0.1137 - val_mae: 0.2360
Epoch 45/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 2s 34ms/step - dmae: 95580.4609 - loss: 0.0667 - mae: 0.1715 - val_dmae: 131305.7500 - val_loss: 0.1124 - val_mae: 0.2355
Epoch 46/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 2s 34ms/step - dmae: 96280.3750 - loss: 0.0685 - mae: 0.1727 - val_dmae: 133229.5938 - val_loss: 0.1140 - val_mae: 0.2390
Epoch 47/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 2s 34ms/step - dmae: 97576.2266 - loss: 0.0692 - mae: 0.1750 - val_dmae: 130656.0078 - val_loss: 0.1115 - val_mae: 0.2344
Epoch 48/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 2s 34ms/step - dmae: 96363.0391 - loss: 0.0674 - mae: 0.1729 - val_dmae: 130564.7422 - val_loss: 0.1097 - val_mae: 0.2342
Epoch 48: early stopping
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 16ms/step - dmae: 72603.0781 - loss: 0.0325 - mae: 0.1302
Out[39]:
[0.027373870834708214, 64881.5390625, 0.11639059334993362]

We see that the results with the LSTM cell (73k) are worse than those with the GRU (64k). In the original paper of the LSTM (Hochreiter and Schmidhuber (1997)) authors describe that the LSTM cell is able to better contextualize longer sequences thanks to the three gates that control which information is maintained and forgotten. In other fields where sequences are longer (e.g. in NLP where we expect sentences to be conformed by 10-20 words) the LSTMs considerably outperform GRUs. In this dataset, since the sequence length is fixed to $S=2$, we see no significant diference between this two cells. The GRU seems to retrieve slightly better results (probably due to the lower number of parameters and then the less bias to overfitting) than the LSTM.

As a final improvement in our architecture, we can enable the option of bidirectional processing in the recurrent layers. The bidirectional processing consists of learning the recurrent information of an input sequence from left-to-right and right-to-left, and concatenating the hidden contextualizations to return a new sequence contextualization with past and future information. The bidirectionality has demonstrated a considerable improvement in recurrent layers since it allows the network to contextualize current information with future observations.

The next cell executes a bidirectional GRU-based encoder with 2-stacked FFNs in the decoder, training with a smaller learning rate to help the network to smoothly optimize its weights.

In [26]:
model_lstm = WalmartModel(2, base_layer=GRU, hidden_size=50, num_encoder_layers=3, dropout=0.1, bidirectional=True)
model_lstm.train(train, val, 'results/walmart3.weights.h5', Adam(5e-4), batch_size=BATCH_SIZE)
model_lstm.evaluate(test)
Epoch 1/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 11s 61ms/step - dmae: 511694.0312 - loss: 1.1551 - mae: 0.9179 - val_dmae: 189024.5156 - val_loss: 0.2542 - val_mae: 0.3391
Epoch 2/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 2s 34ms/step - dmae: 142497.4375 - loss: 0.1530 - mae: 0.2556 - val_dmae: 186431.4219 - val_loss: 0.2618 - val_mae: 0.3344
Epoch 3/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 2s 34ms/step - dmae: 108881.1016 - loss: 0.1089 - mae: 0.1953 - val_dmae: 174624.7188 - val_loss: 0.2308 - val_mae: 0.3133
Epoch 4/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 2s 33ms/step - dmae: 107733.7578 - loss: 0.1084 - mae: 0.1933 - val_dmae: 172770.8125 - val_loss: 0.2240 - val_mae: 0.3099
Epoch 5/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 2s 33ms/step - dmae: 107267.8594 - loss: 0.1057 - mae: 0.1924 - val_dmae: 171325.6094 - val_loss: 0.2185 - val_mae: 0.3073
Epoch 6/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 33ms/step - dmae: 108637.7891 - loss: 0.1061 - mae: 0.1949 - val_dmae: 169672.3750 - val_loss: 0.2139 - val_mae: 0.3044
Epoch 7/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 33ms/step - dmae: 108563.2109 - loss: 0.1049 - mae: 0.1948 - val_dmae: 168942.3594 - val_loss: 0.2103 - val_mae: 0.3031
Epoch 8/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 2s 33ms/step - dmae: 109264.6172 - loss: 0.1058 - mae: 0.1960 - val_dmae: 167967.6250 - val_loss: 0.2068 - val_mae: 0.3013
Epoch 9/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 33ms/step - dmae: 109888.1797 - loss: 0.1050 - mae: 0.1971 - val_dmae: 167338.2500 - val_loss: 0.2035 - val_mae: 0.3002
Epoch 10/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 2s 34ms/step - dmae: 110322.7031 - loss: 0.1040 - mae: 0.1979 - val_dmae: 166839.9688 - val_loss: 0.2007 - val_mae: 0.2993
Epoch 11/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 2s 34ms/step - dmae: 111002.6016 - loss: 0.1040 - mae: 0.1991 - val_dmae: 166145.4375 - val_loss: 0.1969 - val_mae: 0.2980
Epoch 12/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 2s 33ms/step - dmae: 111035.5859 - loss: 0.1032 - mae: 0.1992 - val_dmae: 165776.1094 - val_loss: 0.1946 - val_mae: 0.2974
Epoch 13/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 33ms/step - dmae: 111394.9844 - loss: 0.1029 - mae: 0.1998 - val_dmae: 165218.1719 - val_loss: 0.1916 - val_mae: 0.2964
Epoch 14/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 2s 33ms/step - dmae: 111347.3828 - loss: 0.1022 - mae: 0.1997 - val_dmae: 164132.3594 - val_loss: 0.1886 - val_mae: 0.2944
Epoch 15/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 110852.0625 - loss: 0.1001 - mae: 0.1989 - val_dmae: 164826.7031 - val_loss: 0.1868 - val_mae: 0.2957
Epoch 16/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 2s 33ms/step - dmae: 111642.8672 - loss: 0.1013 - mae: 0.2003 - val_dmae: 163517.5781 - val_loss: 0.1834 - val_mae: 0.2933
Epoch 17/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 110185.2578 - loss: 0.0988 - mae: 0.1977 - val_dmae: 162725.9375 - val_loss: 0.1807 - val_mae: 0.2919
Epoch 18/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 110214.9531 - loss: 0.0973 - mae: 0.1977 - val_dmae: 162752.4219 - val_loss: 0.1786 - val_mae: 0.2920
Epoch 19/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 2s 33ms/step - dmae: 110505.2734 - loss: 0.0977 - mae: 0.1982 - val_dmae: 161590.3594 - val_loss: 0.1752 - val_mae: 0.2899
Epoch 20/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 2s 33ms/step - dmae: 110159.6094 - loss: 0.0953 - mae: 0.1976 - val_dmae: 160823.5000 - val_loss: 0.1723 - val_mae: 0.2885
Epoch 21/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 33ms/step - dmae: 109186.1094 - loss: 0.0936 - mae: 0.1959 - val_dmae: 159566.3281 - val_loss: 0.1683 - val_mae: 0.2862
Epoch 22/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 2s 33ms/step - dmae: 108622.1641 - loss: 0.0918 - mae: 0.1949 - val_dmae: 158404.7500 - val_loss: 0.1638 - val_mae: 0.2842
Epoch 23/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 108196.7969 - loss: 0.0905 - mae: 0.1941 - val_dmae: 158937.7656 - val_loss: 0.1609 - val_mae: 0.2851
Epoch 24/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 2s 33ms/step - dmae: 107088.9844 - loss: 0.0880 - mae: 0.1921 - val_dmae: 158154.9062 - val_loss: 0.1579 - val_mae: 0.2837
Epoch 25/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 2s 33ms/step - dmae: 105292.7422 - loss: 0.0851 - mae: 0.1889 - val_dmae: 157132.9062 - val_loss: 0.1550 - val_mae: 0.2819
Epoch 26/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 2s 33ms/step - dmae: 105309.1875 - loss: 0.0841 - mae: 0.1889 - val_dmae: 156571.4375 - val_loss: 0.1535 - val_mae: 0.2809
Epoch 27/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 33ms/step - dmae: 103167.1172 - loss: 0.0806 - mae: 0.1851 - val_dmae: 151948.5938 - val_loss: 0.1486 - val_mae: 0.2726
Epoch 28/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 100862.3594 - loss: 0.0779 - mae: 0.1809 - val_dmae: 152059.4062 - val_loss: 0.1477 - val_mae: 0.2728
Epoch 29/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 2s 34ms/step - dmae: 100803.9766 - loss: 0.0778 - mae: 0.1808 - val_dmae: 150152.9688 - val_loss: 0.1442 - val_mae: 0.2694
Epoch 30/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 2s 33ms/step - dmae: 98984.7422 - loss: 0.0757 - mae: 0.1776 - val_dmae: 148794.4375 - val_loss: 0.1423 - val_mae: 0.2669
Epoch 31/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 2s 33ms/step - dmae: 97321.9062 - loss: 0.0732 - mae: 0.1746 - val_dmae: 148345.7031 - val_loss: 0.1397 - val_mae: 0.2661
Epoch 32/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 33ms/step - dmae: 97453.7578 - loss: 0.0725 - mae: 0.1748 - val_dmae: 146179.5938 - val_loss: 0.1370 - val_mae: 0.2622
Epoch 33/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 33ms/step - dmae: 96209.7656 - loss: 0.0715 - mae: 0.1726 - val_dmae: 145746.8906 - val_loss: 0.1345 - val_mae: 0.2615
Epoch 34/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 2s 33ms/step - dmae: 96604.7344 - loss: 0.0714 - mae: 0.1733 - val_dmae: 143477.1875 - val_loss: 0.1311 - val_mae: 0.2574
Epoch 35/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 96501.2734 - loss: 0.0699 - mae: 0.1731 - val_dmae: 143601.0781 - val_loss: 0.1310 - val_mae: 0.2576
Epoch 36/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 2s 33ms/step - dmae: 95101.6719 - loss: 0.0683 - mae: 0.1706 - val_dmae: 142305.2656 - val_loss: 0.1277 - val_mae: 0.2553
Epoch 37/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 95438.9453 - loss: 0.0676 - mae: 0.1712 - val_dmae: 142679.8125 - val_loss: 0.1265 - val_mae: 0.2560
Epoch 38/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 2s 34ms/step - dmae: 94747.6562 - loss: 0.0663 - mae: 0.1700 - val_dmae: 137811.0156 - val_loss: 0.1201 - val_mae: 0.2472
Epoch 39/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 93081.6562 - loss: 0.0651 - mae: 0.1670 - val_dmae: 139094.8281 - val_loss: 0.1209 - val_mae: 0.2495
Epoch 40/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 2s 34ms/step - dmae: 94390.5391 - loss: 0.0659 - mae: 0.1693 - val_dmae: 135983.8594 - val_loss: 0.1169 - val_mae: 0.2439
Epoch 41/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 33ms/step - dmae: 92927.3203 - loss: 0.0644 - mae: 0.1667 - val_dmae: 140755.0312 - val_loss: 0.1227 - val_mae: 0.2525
Epoch 42/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 33ms/step - dmae: 94259.3516 - loss: 0.0654 - mae: 0.1691 - val_dmae: 133848.7500 - val_loss: 0.1140 - val_mae: 0.2401
Epoch 43/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 93657.9922 - loss: 0.0650 - mae: 0.1680 - val_dmae: 137869.3594 - val_loss: 0.1192 - val_mae: 0.2473
Epoch 44/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 92605.9922 - loss: 0.0634 - mae: 0.1661 - val_dmae: 134989.3438 - val_loss: 0.1148 - val_mae: 0.2422
Epoch 45/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 33ms/step - dmae: 92207.6094 - loss: 0.0634 - mae: 0.1654 - val_dmae: 135320.2812 - val_loss: 0.1154 - val_mae: 0.2428
Epoch 46/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 33ms/step - dmae: 93152.2578 - loss: 0.0638 - mae: 0.1671 - val_dmae: 130911.6562 - val_loss: 0.1084 - val_mae: 0.2348
Epoch 47/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 33ms/step - dmae: 92816.6250 - loss: 0.0637 - mae: 0.1665 - val_dmae: 131079.3750 - val_loss: 0.1083 - val_mae: 0.2351
Epoch 48/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 91563.8828 - loss: 0.0619 - mae: 0.1643 - val_dmae: 135479.6406 - val_loss: 0.1136 - val_mae: 0.2430
Epoch 49/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 33ms/step - dmae: 91637.3594 - loss: 0.0623 - mae: 0.1644 - val_dmae: 135391.9844 - val_loss: 0.1133 - val_mae: 0.2429
Epoch 50/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 91618.4531 - loss: 0.0623 - mae: 0.1644 - val_dmae: 139915.0625 - val_loss: 0.1206 - val_mae: 0.2510
Epoch 51/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 92722.4375 - loss: 0.0629 - mae: 0.1663 - val_dmae: 134980.9062 - val_loss: 0.1145 - val_mae: 0.2421
Epoch 51: early stopping
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 15ms/step - dmae: 69979.5078 - loss: 0.0293 - mae: 0.1255
Out[26]:
[0.02632186934351921, 64908.85546875, 0.1164395734667778]

The bidirectional GRU achieves a better MAE than the LSTM-based model but does not outperform the unidirectional GRU. The main reason of why bidirectionality does not improve baseline results is explained again by (1) the sequence length of the data stream and (2) the information collected in our dataset. Bidirectionality usually works well with longer sequences where future information is important to give a meaning to the whole sequence (e.g. in natural language). In this case, the sequences are shorter ($S=2$) and left-to-right processing makes more sense than bidirectionality since in the real environment the data stream is also generated from left to right and the target is always a future outcome from the previous input observations.

As a final improvement of our network, now that it has been demonstrated that the unidirectional GRU cell is the best option as the base recurrent layer, we increased again the sequence length to $S=3$ to see if the architecture can be further improved:

In [28]:
model = WalmartModel(3, base_layer=GRU, hidden_size=50, num_encoder_layers=3, activation='relu')
model.train(train, val, 'results/walmart3.weights.h5', Adam(1e-4), batch_size=BATCH_SIZE)
model.evaluate(test)
Epoch 1/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 6s 48ms/step - dmae: 540893.8750 - loss: 1.2754 - mae: 0.9703 - val_dmae: 449361.0312 - val_loss: 0.9042 - val_mae: 0.8061
Epoch 2/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 538479.1875 - loss: 1.2653 - mae: 0.9660 - val_dmae: 447091.1875 - val_loss: 0.8951 - val_mae: 0.8020
Epoch 3/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 535209.2500 - loss: 1.2513 - mae: 0.9601 - val_dmae: 443291.3125 - val_loss: 0.8800 - val_mae: 0.7952
Epoch 4/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 529381.6875 - loss: 1.2268 - mae: 0.9497 - val_dmae: 436300.8125 - val_loss: 0.8528 - val_mae: 0.7827
Epoch 5/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 33ms/step - dmae: 518687.8125 - loss: 1.1826 - mae: 0.9305 - val_dmae: 423372.2500 - val_loss: 0.8035 - val_mae: 0.7595
Epoch 6/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 33ms/step - dmae: 499458.9688 - loss: 1.1044 - mae: 0.8960 - val_dmae: 401133.8438 - val_loss: 0.7227 - val_mae: 0.7196
Epoch 7/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 33ms/step - dmae: 466482.4062 - loss: 0.9773 - mae: 0.8368 - val_dmae: 364942.8438 - val_loss: 0.6002 - val_mae: 0.6547
Epoch 8/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 33ms/step - dmae: 411903.0312 - loss: 0.7847 - mae: 0.7389 - val_dmae: 308532.4062 - val_loss: 0.4345 - val_mae: 0.5535
Epoch 9/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 33ms/step - dmae: 328275.8438 - loss: 0.5317 - mae: 0.5889 - val_dmae: 236963.0156 - val_loss: 0.2656 - val_mae: 0.4251
Epoch 10/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 224794.2812 - loss: 0.2873 - mae: 0.4033 - val_dmae: 190205.9688 - val_loss: 0.1850 - val_mae: 0.3412
Epoch 11/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 149743.4062 - loss: 0.1608 - mae: 0.2686 - val_dmae: 178606.9844 - val_loss: 0.1855 - val_mae: 0.3204
Epoch 12/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 132380.6406 - loss: 0.1441 - mae: 0.2375 - val_dmae: 174568.6406 - val_loss: 0.1837 - val_mae: 0.3132
Epoch 13/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 128074.0703 - loss: 0.1378 - mae: 0.2298 - val_dmae: 171316.6250 - val_loss: 0.1807 - val_mae: 0.3073
Epoch 14/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 33ms/step - dmae: 123321.8047 - loss: 0.1309 - mae: 0.2212 - val_dmae: 168996.5781 - val_loss: 0.1795 - val_mae: 0.3032
Epoch 15/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 121276.4062 - loss: 0.1307 - mae: 0.2176 - val_dmae: 167217.9844 - val_loss: 0.1792 - val_mae: 0.3000
Epoch 16/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 119898.6719 - loss: 0.1266 - mae: 0.2151 - val_dmae: 165710.2500 - val_loss: 0.1779 - val_mae: 0.2973
Epoch 17/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 33ms/step - dmae: 120748.2500 - loss: 0.1311 - mae: 0.2166 - val_dmae: 164155.7812 - val_loss: 0.1751 - val_mae: 0.2945
Epoch 18/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 118620.0938 - loss: 0.1283 - mae: 0.2128 - val_dmae: 163283.8750 - val_loss: 0.1744 - val_mae: 0.2929
Epoch 19/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 33ms/step - dmae: 119612.2266 - loss: 0.1292 - mae: 0.2146 - val_dmae: 162294.1719 - val_loss: 0.1721 - val_mae: 0.2911
Epoch 20/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 117537.4141 - loss: 0.1259 - mae: 0.2108 - val_dmae: 161565.1562 - val_loss: 0.1715 - val_mae: 0.2898
Epoch 21/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 118515.7188 - loss: 0.1237 - mae: 0.2126 - val_dmae: 161125.4375 - val_loss: 0.1707 - val_mae: 0.2890
Epoch 22/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 116537.5859 - loss: 0.1236 - mae: 0.2091 - val_dmae: 160738.9531 - val_loss: 0.1703 - val_mae: 0.2883
Epoch 23/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 115967.6094 - loss: 0.1221 - mae: 0.2080 - val_dmae: 159929.4531 - val_loss: 0.1678 - val_mae: 0.2869
Epoch 24/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 116750.2656 - loss: 0.1233 - mae: 0.2094 - val_dmae: 159184.4531 - val_loss: 0.1662 - val_mae: 0.2856
Epoch 25/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 114966.4922 - loss: 0.1223 - mae: 0.2062 - val_dmae: 158413.7344 - val_loss: 0.1646 - val_mae: 0.2842
Epoch 26/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 114064.5000 - loss: 0.1196 - mae: 0.2046 - val_dmae: 157375.3125 - val_loss: 0.1621 - val_mae: 0.2823
Epoch 27/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 114311.6562 - loss: 0.1190 - mae: 0.2051 - val_dmae: 156778.7031 - val_loss: 0.1611 - val_mae: 0.2812
Epoch 28/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 114478.1328 - loss: 0.1203 - mae: 0.2054 - val_dmae: 155944.7656 - val_loss: 0.1593 - val_mae: 0.2797
Epoch 29/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 113959.6016 - loss: 0.1200 - mae: 0.2044 - val_dmae: 154698.1406 - val_loss: 0.1567 - val_mae: 0.2775
Epoch 30/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 112983.5547 - loss: 0.1161 - mae: 0.2027 - val_dmae: 154906.0625 - val_loss: 0.1580 - val_mae: 0.2779
Epoch 31/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 33ms/step - dmae: 113660.1641 - loss: 0.1153 - mae: 0.2039 - val_dmae: 153222.2500 - val_loss: 0.1539 - val_mae: 0.2749
Epoch 32/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 112292.4141 - loss: 0.1149 - mae: 0.2014 - val_dmae: 153329.9375 - val_loss: 0.1551 - val_mae: 0.2751
Epoch 33/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 111415.6875 - loss: 0.1130 - mae: 0.1999 - val_dmae: 151707.0312 - val_loss: 0.1515 - val_mae: 0.2721
Epoch 34/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 33ms/step - dmae: 111281.8516 - loss: 0.1122 - mae: 0.1996 - val_dmae: 151017.2188 - val_loss: 0.1500 - val_mae: 0.2709
Epoch 35/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 33ms/step - dmae: 112849.5625 - loss: 0.1135 - mae: 0.2024 - val_dmae: 149998.5000 - val_loss: 0.1486 - val_mae: 0.2691
Epoch 36/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 33ms/step - dmae: 110330.5859 - loss: 0.1082 - mae: 0.1979 - val_dmae: 149588.4531 - val_loss: 0.1481 - val_mae: 0.2683
Epoch 37/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 33ms/step - dmae: 110554.2500 - loss: 0.1129 - mae: 0.1983 - val_dmae: 149249.8750 - val_loss: 0.1482 - val_mae: 0.2677
Epoch 38/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 110144.3750 - loss: 0.1089 - mae: 0.1976 - val_dmae: 147686.2344 - val_loss: 0.1454 - val_mae: 0.2649
Epoch 39/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 110666.6172 - loss: 0.1090 - mae: 0.1985 - val_dmae: 146978.9062 - val_loss: 0.1445 - val_mae: 0.2637
Epoch 40/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 109784.4375 - loss: 0.1084 - mae: 0.1969 - val_dmae: 146587.7031 - val_loss: 0.1440 - val_mae: 0.2630
Epoch 41/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 109904.5234 - loss: 0.1099 - mae: 0.1972 - val_dmae: 145439.9844 - val_loss: 0.1421 - val_mae: 0.2609
Epoch 42/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 110131.3125 - loss: 0.1081 - mae: 0.1976 - val_dmae: 145447.8438 - val_loss: 0.1427 - val_mae: 0.2609
Epoch 43/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 33ms/step - dmae: 109797.9609 - loss: 0.1075 - mae: 0.1970 - val_dmae: 144671.6094 - val_loss: 0.1416 - val_mae: 0.2595
Epoch 44/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 33ms/step - dmae: 108804.9453 - loss: 0.1065 - mae: 0.1952 - val_dmae: 144289.3281 - val_loss: 0.1413 - val_mae: 0.2588
Epoch 45/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 108217.3594 - loss: 0.1064 - mae: 0.1941 - val_dmae: 143205.9688 - val_loss: 0.1390 - val_mae: 0.2569
Epoch 46/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 108837.2109 - loss: 0.1069 - mae: 0.1952 - val_dmae: 144665.3281 - val_loss: 0.1422 - val_mae: 0.2595
Epoch 47/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 107000.0391 - loss: 0.1057 - mae: 0.1919 - val_dmae: 142787.6250 - val_loss: 0.1388 - val_mae: 0.2561
Epoch 48/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 33ms/step - dmae: 108004.4297 - loss: 0.1049 - mae: 0.1937 - val_dmae: 142663.8125 - val_loss: 0.1387 - val_mae: 0.2559
Epoch 49/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 31ms/step - dmae: 108662.1172 - loss: 0.1064 - mae: 0.1949 - val_dmae: 142986.2656 - val_loss: 0.1394 - val_mae: 0.2565
Epoch 50/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 106500.9219 - loss: 0.1055 - mae: 0.1911 - val_dmae: 142966.2031 - val_loss: 0.1394 - val_mae: 0.2565
Epoch 51/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 33ms/step - dmae: 108066.3672 - loss: 0.1040 - mae: 0.1939 - val_dmae: 141710.1406 - val_loss: 0.1377 - val_mae: 0.2542
Epoch 52/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 108070.0625 - loss: 0.1044 - mae: 0.1939 - val_dmae: 142136.1719 - val_loss: 0.1380 - val_mae: 0.2550
Epoch 53/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 108798.0391 - loss: 0.1051 - mae: 0.1952 - val_dmae: 141703.1719 - val_loss: 0.1376 - val_mae: 0.2542
Epoch 54/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 108090.4453 - loss: 0.1038 - mae: 0.1939 - val_dmae: 142201.6094 - val_loss: 0.1383 - val_mae: 0.2551
Epoch 55/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 33ms/step - dmae: 107303.2500 - loss: 0.1035 - mae: 0.1925 - val_dmae: 141007.3281 - val_loss: 0.1370 - val_mae: 0.2530
Epoch 56/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 106790.6719 - loss: 0.1030 - mae: 0.1916 - val_dmae: 140742.4375 - val_loss: 0.1356 - val_mae: 0.2525
Epoch 57/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 33ms/step - dmae: 108648.7656 - loss: 0.1031 - mae: 0.1949 - val_dmae: 140455.1562 - val_loss: 0.1362 - val_mae: 0.2520
Epoch 58/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 105881.9219 - loss: 0.1018 - mae: 0.1899 - val_dmae: 139839.9688 - val_loss: 0.1347 - val_mae: 0.2509
Epoch 59/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 107379.3359 - loss: 0.1027 - mae: 0.1926 - val_dmae: 140462.4531 - val_loss: 0.1358 - val_mae: 0.2520
Epoch 60/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 106281.9766 - loss: 0.1009 - mae: 0.1907 - val_dmae: 140288.6875 - val_loss: 0.1358 - val_mae: 0.2517
Epoch 61/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 107474.5000 - loss: 0.1025 - mae: 0.1928 - val_dmae: 139368.0156 - val_loss: 0.1339 - val_mae: 0.2500
Epoch 62/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 105230.5703 - loss: 0.1008 - mae: 0.1888 - val_dmae: 140052.0312 - val_loss: 0.1355 - val_mae: 0.2512
Epoch 63/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 107418.7188 - loss: 0.1024 - mae: 0.1927 - val_dmae: 139620.7500 - val_loss: 0.1344 - val_mae: 0.2505
Epoch 64/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 106750.2500 - loss: 0.1021 - mae: 0.1915 - val_dmae: 139406.6094 - val_loss: 0.1340 - val_mae: 0.2501
Epoch 65/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 105582.7500 - loss: 0.1015 - mae: 0.1894 - val_dmae: 139395.6094 - val_loss: 0.1343 - val_mae: 0.2501
Epoch 66/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 105239.0000 - loss: 0.0998 - mae: 0.1888 - val_dmae: 138382.9062 - val_loss: 0.1332 - val_mae: 0.2482
Epoch 67/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 106342.9844 - loss: 0.1037 - mae: 0.1908 - val_dmae: 138564.4375 - val_loss: 0.1330 - val_mae: 0.2486
Epoch 68/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 105170.2578 - loss: 0.1006 - mae: 0.1887 - val_dmae: 137998.2656 - val_loss: 0.1322 - val_mae: 0.2476
Epoch 69/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 104818.1641 - loss: 0.0981 - mae: 0.1880 - val_dmae: 139308.3438 - val_loss: 0.1347 - val_mae: 0.2499
Epoch 70/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 106699.0078 - loss: 0.1004 - mae: 0.1914 - val_dmae: 139299.6562 - val_loss: 0.1341 - val_mae: 0.2499
Epoch 71/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 105436.6016 - loss: 0.0982 - mae: 0.1891 - val_dmae: 138671.7500 - val_loss: 0.1336 - val_mae: 0.2488
Epoch 72/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 106009.1016 - loss: 0.0989 - mae: 0.1902 - val_dmae: 139040.2031 - val_loss: 0.1338 - val_mae: 0.2494
Epoch 73/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 33ms/step - dmae: 107466.4688 - loss: 0.1009 - mae: 0.1928 - val_dmae: 137619.5469 - val_loss: 0.1307 - val_mae: 0.2469
Epoch 74/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 107295.4922 - loss: 0.1017 - mae: 0.1925 - val_dmae: 138153.8125 - val_loss: 0.1321 - val_mae: 0.2478
Epoch 75/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 103870.9141 - loss: 0.0978 - mae: 0.1863 - val_dmae: 139248.2969 - val_loss: 0.1338 - val_mae: 0.2498
Epoch 76/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 105554.5938 - loss: 0.1001 - mae: 0.1894 - val_dmae: 138325.0938 - val_loss: 0.1323 - val_mae: 0.2481
Epoch 77/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 105591.8828 - loss: 0.0984 - mae: 0.1894 - val_dmae: 139266.5000 - val_loss: 0.1342 - val_mae: 0.2498
Epoch 78/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 105442.5234 - loss: 0.0995 - mae: 0.1892 - val_dmae: 137623.3750 - val_loss: 0.1314 - val_mae: 0.2469
Epoch 78: early stopping
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 15ms/step - dmae: 63942.0352 - loss: 0.0242 - mae: 0.1147
Out[28]:
[0.02329966053366661, 61717.328125, 0.11071430891752243]
In [37]:
plot_series(model, [train, val, test], title='Predictions with S=3').show()
2024-04-11 13:13:27.407163: W tensorflow/core/framework/local_rendezvous.cc:404] Local rendezvous is aborting with status: OUT_OF_RANGE: End of sequence
2024-04-11 13:13:33.286880: W tensorflow/core/framework/local_rendezvous.cc:404] Local rendezvous is aborting with status: OUT_OF_RANGE: End of sequence
2024-04-11 13:13:39.721647: W tensorflow/core/framework/local_rendezvous.cc:404] Local rendezvous is aborting with status: OUT_OF_RANGE: End of sequence

By increasing the $S$ value we obtain the best MAE result of 61k points.

Regularization hyperparameters¶

There are other hyperparameters of the network with less explicability than the previous explained configurations (bidirectionality, GRU vs LSTM, dropout, dimension of the model, etc.). For those hyperparameters we prepared a grid search to obtain the best configuration. We did not include the output of this cell since it was executed from the terminal through a Python script and it required a large amount of time to finish. The best configuration obtained 15.3k points of MAE in the validation set and 60k points in the test set and used L2 regularization (with $\alpha=0.001$), random normal weight initialization and hyperbolic tangent activation function.

In [54]:
grid = OrderedDict(
    regularizer = [L1(1e-3), L2(1e-3), L1L2(1e-4)],
    initializer=['random_normal', 'glorot_uniform'],
    activation=['tanh', 'relu']
)

def applydeep(lists, func):
    result = []
    for item in lists:
        result.append(list(map(func, item)))
    return result

df = pd.DataFrame(columns=['train', 'val', 'test'], index=pd.MultiIndex.from_product(applydeep(grid.values(), str)))
# for i, params in enumerate(product(*grid.values())):
#     params = dict(zip(grid.keys(), params))
#     model = WalmartModel(seq_len=3, base_layer=GRU, num_encoder_layers=3, num_decoder_layers=2,  bidirectional=False, dropout=0.1,**params)
#     model.train(train, test, f'results/walmart.weights.h5', Adam(1e-4), batch_size=BATCH_SIZE)
#     (_, train_mae, _), (_, val_mae, _), (_, test_mae, _) = map(model.evaluate, (train, val, test))
#     df.loc[tuple(map(str, params.values()))] = [train_mae, val_mae, test_mae]
#     df.to_csv('grid.csv')
df = pd.read_csv('grid.csv', index_col=[0, 1, 2])
df.index.names = ['regularizer', 'initializer', 'activation']
df
Out[54]:
train val test
regularizer initializer activation
L1 random_normal tanh 97176.039062 160986.875000 65295.449219
relu 100092.359375 158176.218750 70685.218750
glorot_uniform tanh 95431.453125 155389.281250 65046.066406
relu 131795.890625 179003.968750 103291.867188
L2 random_normal tanh 91819.726562 153123.328125 60459.160156
relu 94866.132812 141971.531250 65930.945312
glorot_uniform tanh 99021.359375 145237.359375 76132.109375
relu 90433.320312 136006.000000 65732.023438
L1L2 random_normal tanh 102642.031250 150206.656250 71680.054688
relu 92494.460938 136899.968750 64125.890625
glorot_uniform tanh 103966.273438 156001.703125 74629.656250
relu 91238.843750 137254.921875 70335.656250
In [55]:
df.idxmin()
Out[55]:
train    (L2, glorot_uniform, relu)
val      (L2, glorot_uniform, relu)
test      (L2, random_normal, tanh)
dtype: object

Final comparison¶

In this section we make a final comparison of the GRU-based models with different sequence lengths and hidden dimensions with the optimal regularization obtained in the grid search.

In [57]:
seq_lens = [2, 3, 4]
hidden_size = [10, 20, 30]
dropout = [0.1, 0.1, 0.2]
params = list(zip(seq_lens, hidden_size, dropout))

comparison = pd.DataFrame(columns=['train', 'val', 'test'], index=pd.MultiIndex.from_tuples(params))
comparison.index.names = ['S', 'h', 'd']
models = []

for s, h, d in params:
    mod = WalmartModel(s, base_layer=GRU, hidden_size=h, num_encoder_layers=3, regularizer=L2(1e-3))
    mod.train(train, val, 'results/walm.weights.h5', Adam(1e-4), batch_size=BATCH_SIZE)
    models.append(mod)
    (_, train_mae, _), (_, val_mae, _), (_, test_mae, _) = map(mod.evaluate, (train, val, test))
    comparison.loc[(s,h,d)] = (train_mae, val_mae, test_mae)
print(comparison)
Epoch 1/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 5s 47ms/step - dmae: 541755.5000 - loss: 1.2926 - mae: 0.9719 - val_dmae: 486016.5312 - val_loss: 1.1759 - val_mae: 0.8719
Epoch 2/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 33ms/step - dmae: 540741.0000 - loss: 1.2873 - mae: 0.9700 - val_dmae: 485439.1562 - val_loss: 1.1730 - val_mae: 0.8708
Epoch 3/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 33ms/step - dmae: 540119.7500 - loss: 1.2842 - mae: 0.9689 - val_dmae: 484678.6562 - val_loss: 1.1695 - val_mae: 0.8695
Epoch 4/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 539263.3125 - loss: 1.2802 - mae: 0.9674 - val_dmae: 483638.4375 - val_loss: 1.1647 - val_mae: 0.8676
Epoch 5/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 538073.0625 - loss: 1.2748 - mae: 0.9652 - val_dmae: 482169.0312 - val_loss: 1.1580 - val_mae: 0.8650
Epoch 6/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 2s 33ms/step - dmae: 536261.8750 - loss: 1.2665 - mae: 0.9620 - val_dmae: 480014.0000 - val_loss: 1.1484 - val_mae: 0.8611
Epoch 7/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 33ms/step - dmae: 533669.2500 - loss: 1.2550 - mae: 0.9573 - val_dmae: 476838.2812 - val_loss: 1.1342 - val_mae: 0.8554
Epoch 8/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 33ms/step - dmae: 529775.0625 - loss: 1.2379 - mae: 0.9504 - val_dmae: 472159.6562 - val_loss: 1.1136 - val_mae: 0.8470
Epoch 9/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 33ms/step - dmae: 524005.4688 - loss: 1.2129 - mae: 0.9400 - val_dmae: 465326.2188 - val_loss: 1.0836 - val_mae: 0.8347
Epoch 10/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 515679.2188 - loss: 1.1771 - mae: 0.9251 - val_dmae: 455551.4375 - val_loss: 1.0413 - val_mae: 0.8172
Epoch 11/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 503705.9062 - loss: 1.1268 - mae: 0.9036 - val_dmae: 441928.2812 - val_loss: 0.9833 - val_mae: 0.7928
Epoch 12/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 33ms/step - dmae: 486855.1562 - loss: 1.0584 - mae: 0.8734 - val_dmae: 423729.6875 - val_loss: 0.9074 - val_mae: 0.7601
Epoch 13/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 33ms/step - dmae: 464136.8125 - loss: 0.9691 - mae: 0.8326 - val_dmae: 400243.1250 - val_loss: 0.8132 - val_mae: 0.7180
Epoch 14/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 433702.2812 - loss: 0.8540 - mae: 0.7780 - val_dmae: 370699.5312 - val_loss: 0.7040 - val_mae: 0.6650
Epoch 15/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 395132.6250 - loss: 0.7210 - mae: 0.7088 - val_dmae: 336227.6250 - val_loss: 0.5882 - val_mae: 0.6032
Epoch 16/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 33ms/step - dmae: 351264.4688 - loss: 0.5864 - mae: 0.6301 - val_dmae: 299477.1562 - val_loss: 0.4780 - val_mae: 0.5372
Epoch 17/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 33ms/step - dmae: 302069.1250 - loss: 0.4515 - mae: 0.5419 - val_dmae: 264971.8750 - val_loss: 0.3849 - val_mae: 0.4753
Epoch 18/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 250734.3750 - loss: 0.3306 - mae: 0.4498 - val_dmae: 234372.6250 - val_loss: 0.3176 - val_mae: 0.4204
Epoch 19/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 33ms/step - dmae: 204695.1562 - loss: 0.2456 - mae: 0.3672 - val_dmae: 209598.8438 - val_loss: 0.2769 - val_mae: 0.3760
Epoch 20/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 33ms/step - dmae: 166627.9844 - loss: 0.1902 - mae: 0.2989 - val_dmae: 192391.5781 - val_loss: 0.2572 - val_mae: 0.3451
Epoch 21/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 33ms/step - dmae: 140093.2344 - loss: 0.1561 - mae: 0.2513 - val_dmae: 182647.8594 - val_loss: 0.2495 - val_mae: 0.3277
Epoch 22/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 33ms/step - dmae: 125034.7500 - loss: 0.1422 - mae: 0.2243 - val_dmae: 177629.9688 - val_loss: 0.2473 - val_mae: 0.3186
Epoch 23/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 33ms/step - dmae: 118862.5000 - loss: 0.1341 - mae: 0.2132 - val_dmae: 174697.0781 - val_loss: 0.2466 - val_mae: 0.3134
Epoch 24/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 33ms/step - dmae: 115640.1094 - loss: 0.1334 - mae: 0.2074 - val_dmae: 173343.5625 - val_loss: 0.2464 - val_mae: 0.3110
Epoch 25/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 114540.4062 - loss: 0.1301 - mae: 0.2055 - val_dmae: 172248.9844 - val_loss: 0.2459 - val_mae: 0.3090
Epoch 26/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 33ms/step - dmae: 113947.9297 - loss: 0.1308 - mae: 0.2044 - val_dmae: 171746.7812 - val_loss: 0.2456 - val_mae: 0.3081
Epoch 27/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 33ms/step - dmae: 113167.3516 - loss: 0.1295 - mae: 0.2030 - val_dmae: 171283.3125 - val_loss: 0.2448 - val_mae: 0.3073
Epoch 28/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 112855.5391 - loss: 0.1277 - mae: 0.2025 - val_dmae: 170890.9844 - val_loss: 0.2440 - val_mae: 0.3066
Epoch 29/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 33ms/step - dmae: 113117.6953 - loss: 0.1290 - mae: 0.2029 - val_dmae: 170708.3281 - val_loss: 0.2435 - val_mae: 0.3062
Epoch 30/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 112378.6641 - loss: 0.1282 - mae: 0.2016 - val_dmae: 170463.9062 - val_loss: 0.2428 - val_mae: 0.3058
Epoch 31/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 111484.7109 - loss: 0.1273 - mae: 0.2000 - val_dmae: 170425.3281 - val_loss: 0.2424 - val_mae: 0.3057
Epoch 32/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 113268.5078 - loss: 0.1278 - mae: 0.2032 - val_dmae: 170198.3594 - val_loss: 0.2418 - val_mae: 0.3053
Epoch 33/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 33ms/step - dmae: 111964.3203 - loss: 0.1273 - mae: 0.2009 - val_dmae: 170014.6406 - val_loss: 0.2411 - val_mae: 0.3050
Epoch 34/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 33ms/step - dmae: 113903.9688 - loss: 0.1276 - mae: 0.2043 - val_dmae: 169888.9375 - val_loss: 0.2407 - val_mae: 0.3048
Epoch 35/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 2s 33ms/step - dmae: 110414.2031 - loss: 0.1275 - mae: 0.1981 - val_dmae: 169696.6406 - val_loss: 0.2401 - val_mae: 0.3044
Epoch 36/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 31ms/step - dmae: 110534.7344 - loss: 0.1229 - mae: 0.1983 - val_dmae: 169830.0000 - val_loss: 0.2400 - val_mae: 0.3047
Epoch 37/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 33ms/step - dmae: 112959.2812 - loss: 0.1290 - mae: 0.2026 - val_dmae: 169604.3594 - val_loss: 0.2389 - val_mae: 0.3043
Epoch 38/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 110933.2969 - loss: 0.1242 - mae: 0.1990 - val_dmae: 169718.3281 - val_loss: 0.2388 - val_mae: 0.3045
Epoch 39/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 2s 33ms/step - dmae: 111218.2891 - loss: 0.1249 - mae: 0.1995 - val_dmae: 169754.3281 - val_loss: 0.2387 - val_mae: 0.3045
Epoch 40/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 33ms/step - dmae: 110833.2188 - loss: 0.1243 - mae: 0.1988 - val_dmae: 169582.9219 - val_loss: 0.2380 - val_mae: 0.3042
Epoch 41/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 111141.0078 - loss: 0.1245 - mae: 0.1994 - val_dmae: 169408.5469 - val_loss: 0.2374 - val_mae: 0.3039
Epoch 42/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 113713.8906 - loss: 0.1281 - mae: 0.2040 - val_dmae: 169398.2188 - val_loss: 0.2370 - val_mae: 0.3039
Epoch 43/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 33ms/step - dmae: 110979.1875 - loss: 0.1263 - mae: 0.1991 - val_dmae: 169247.1094 - val_loss: 0.2361 - val_mae: 0.3036
Epoch 44/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 33ms/step - dmae: 112750.1797 - loss: 0.1280 - mae: 0.2023 - val_dmae: 169293.0000 - val_loss: 0.2357 - val_mae: 0.3037
Epoch 45/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 2s 33ms/step - dmae: 110678.7891 - loss: 0.1245 - mae: 0.1985 - val_dmae: 169211.4375 - val_loss: 0.2351 - val_mae: 0.3035
Epoch 46/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 33ms/step - dmae: 112826.8125 - loss: 0.1252 - mae: 0.2024 - val_dmae: 169099.6094 - val_loss: 0.2346 - val_mae: 0.3033
Epoch 47/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 33ms/step - dmae: 112137.7109 - loss: 0.1251 - mae: 0.2012 - val_dmae: 169035.3750 - val_loss: 0.2345 - val_mae: 0.3032
Epoch 48/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 111818.3828 - loss: 0.1253 - mae: 0.2006 - val_dmae: 169248.7188 - val_loss: 0.2343 - val_mae: 0.3036
Epoch 49/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 31ms/step - dmae: 112693.6094 - loss: 0.1255 - mae: 0.2022 - val_dmae: 169230.4531 - val_loss: 0.2339 - val_mae: 0.3036
Epoch 50/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 110990.3203 - loss: 0.1227 - mae: 0.1991 - val_dmae: 169093.6719 - val_loss: 0.2333 - val_mae: 0.3033
Epoch 51/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 111977.1562 - loss: 0.1241 - mae: 0.2009 - val_dmae: 169120.1875 - val_loss: 0.2331 - val_mae: 0.3034
Epoch 52/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 110496.5938 - loss: 0.1238 - mae: 0.1982 - val_dmae: 168821.8281 - val_loss: 0.2325 - val_mae: 0.3028
Epoch 53/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 111404.1016 - loss: 0.1246 - mae: 0.1998 - val_dmae: 168894.9375 - val_loss: 0.2322 - val_mae: 0.3030
Epoch 54/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 111326.8047 - loss: 0.1231 - mae: 0.1997 - val_dmae: 169074.4375 - val_loss: 0.2320 - val_mae: 0.3033
Epoch 55/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 112666.7422 - loss: 0.1257 - mae: 0.2021 - val_dmae: 168820.8281 - val_loss: 0.2314 - val_mae: 0.3028
Epoch 56/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 33ms/step - dmae: 112000.4453 - loss: 0.1226 - mae: 0.2009 - val_dmae: 168564.8594 - val_loss: 0.2308 - val_mae: 0.3024
Epoch 57/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 110843.5781 - loss: 0.1218 - mae: 0.1988 - val_dmae: 168719.1719 - val_loss: 0.2305 - val_mae: 0.3027
Epoch 58/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 112877.8047 - loss: 0.1234 - mae: 0.2025 - val_dmae: 168723.3750 - val_loss: 0.2301 - val_mae: 0.3027
Epoch 59/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 112163.5859 - loss: 0.1225 - mae: 0.2012 - val_dmae: 168605.1406 - val_loss: 0.2296 - val_mae: 0.3025
Epoch 60/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 110583.6094 - loss: 0.1226 - mae: 0.1984 - val_dmae: 168479.9062 - val_loss: 0.2291 - val_mae: 0.3022
Epoch 61/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 112469.6250 - loss: 0.1228 - mae: 0.2018 - val_dmae: 168759.9375 - val_loss: 0.2295 - val_mae: 0.3027
Epoch 62/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 111117.6406 - loss: 0.1223 - mae: 0.1993 - val_dmae: 168360.0938 - val_loss: 0.2283 - val_mae: 0.3020
Epoch 63/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 112216.1875 - loss: 0.1245 - mae: 0.2013 - val_dmae: 168565.4531 - val_loss: 0.2284 - val_mae: 0.3024
Epoch 64/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 110823.2188 - loss: 0.1206 - mae: 0.1988 - val_dmae: 168695.3125 - val_loss: 0.2284 - val_mae: 0.3026
Epoch 65/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 33ms/step - dmae: 114080.9141 - loss: 0.1287 - mae: 0.2046 - val_dmae: 168485.3438 - val_loss: 0.2277 - val_mae: 0.3022
Epoch 66/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 111221.4531 - loss: 0.1240 - mae: 0.1995 - val_dmae: 168523.7500 - val_loss: 0.2274 - val_mae: 0.3023
Epoch 67/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 111075.2031 - loss: 0.1221 - mae: 0.1993 - val_dmae: 168274.5781 - val_loss: 0.2266 - val_mae: 0.3019
Epoch 68/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 112207.9531 - loss: 0.1254 - mae: 0.2013 - val_dmae: 168271.8281 - val_loss: 0.2264 - val_mae: 0.3019
Epoch 69/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 112099.9375 - loss: 0.1223 - mae: 0.2011 - val_dmae: 168356.6250 - val_loss: 0.2263 - val_mae: 0.3020
Epoch 70/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 33ms/step - dmae: 111890.1797 - loss: 0.1217 - mae: 0.2007 - val_dmae: 168183.1875 - val_loss: 0.2262 - val_mae: 0.3017
Epoch 71/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 113221.8438 - loss: 0.1235 - mae: 0.2031 - val_dmae: 168414.7656 - val_loss: 0.2261 - val_mae: 0.3021
Epoch 72/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 33ms/step - dmae: 112450.7734 - loss: 0.1227 - mae: 0.2017 - val_dmae: 168139.8281 - val_loss: 0.2252 - val_mae: 0.3016
Epoch 73/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 110827.8984 - loss: 0.1224 - mae: 0.1988 - val_dmae: 168277.1406 - val_loss: 0.2252 - val_mae: 0.3019
Epoch 74/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 110466.4922 - loss: 0.1233 - mae: 0.1982 - val_dmae: 168526.6406 - val_loss: 0.2254 - val_mae: 0.3023
Epoch 75/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 110760.6562 - loss: 0.1205 - mae: 0.1987 - val_dmae: 168219.6250 - val_loss: 0.2247 - val_mae: 0.3018
Epoch 76/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 111289.1875 - loss: 0.1205 - mae: 0.1996 - val_dmae: 168292.6875 - val_loss: 0.2246 - val_mae: 0.3019
Epoch 77/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 112492.0000 - loss: 0.1242 - mae: 0.2018 - val_dmae: 168260.2188 - val_loss: 0.2240 - val_mae: 0.3018
Epoch 77: early stopping
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 16ms/step - dmae: 103072.7188 - loss: 0.1149 - mae: 0.1849
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 16ms/step - dmae: 193433.1094 - loss: 0.2695 - mae: 0.3470
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 16ms/step - dmae: 65636.5156 - loss: 0.0422 - mae: 0.1177
Epoch 1/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 6s 48ms/step - dmae: 539863.9375 - loss: 1.3063 - mae: 0.9685 - val_dmae: 448109.9688 - val_loss: 0.9322 - val_mae: 0.8039
Epoch 2/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 33ms/step - dmae: 536623.3750 - loss: 1.2900 - mae: 0.9626 - val_dmae: 445009.3125 - val_loss: 0.9182 - val_mae: 0.7983
Epoch 3/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 33ms/step - dmae: 532757.8750 - loss: 1.2720 - mae: 0.9557 - val_dmae: 439740.0312 - val_loss: 0.8957 - val_mae: 0.7888
Epoch 4/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 2s 33ms/step - dmae: 525909.6875 - loss: 1.2413 - mae: 0.9434 - val_dmae: 430366.4688 - val_loss: 0.8575 - val_mae: 0.7720
Epoch 5/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 33ms/step - dmae: 513507.7188 - loss: 1.1880 - mae: 0.9212 - val_dmae: 413762.8438 - val_loss: 0.7924 - val_mae: 0.7422
Epoch 6/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 491051.8125 - loss: 1.0947 - mae: 0.8809 - val_dmae: 385543.2188 - val_loss: 0.6882 - val_mae: 0.6916
Epoch 7/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 33ms/step - dmae: 452673.9375 - loss: 0.9452 - mae: 0.8120 - val_dmae: 341481.1875 - val_loss: 0.5433 - val_mae: 0.6126
Epoch 8/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 33ms/step - dmae: 392021.3750 - loss: 0.7302 - mae: 0.7032 - val_dmae: 284952.1562 - val_loss: 0.3889 - val_mae: 0.5112
Epoch 9/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 308481.7500 - loss: 0.4847 - mae: 0.5534 - val_dmae: 239886.0156 - val_loss: 0.2888 - val_mae: 0.4303
Epoch 10/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 219568.6562 - loss: 0.2916 - mae: 0.3939 - val_dmae: 211272.9219 - val_loss: 0.2667 - val_mae: 0.3790
Epoch 11/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 154411.8750 - loss: 0.1981 - mae: 0.2770 - val_dmae: 194205.4219 - val_loss: 0.2697 - val_mae: 0.3484
Epoch 12/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 33ms/step - dmae: 129648.4297 - loss: 0.1741 - mae: 0.2326 - val_dmae: 184803.8906 - val_loss: 0.2655 - val_mae: 0.3315
Epoch 13/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 121537.8672 - loss: 0.1696 - mae: 0.2180 - val_dmae: 179690.3750 - val_loss: 0.2588 - val_mae: 0.3223
Epoch 14/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 117371.1719 - loss: 0.1646 - mae: 0.2106 - val_dmae: 176636.5000 - val_loss: 0.2539 - val_mae: 0.3169
Epoch 15/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 33ms/step - dmae: 116032.6016 - loss: 0.1631 - mae: 0.2082 - val_dmae: 174515.6250 - val_loss: 0.2489 - val_mae: 0.3131
Epoch 16/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 115630.3359 - loss: 0.1614 - mae: 0.2074 - val_dmae: 173419.9219 - val_loss: 0.2469 - val_mae: 0.3111
Epoch 17/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 114960.6016 - loss: 0.1609 - mae: 0.2062 - val_dmae: 172538.9688 - val_loss: 0.2441 - val_mae: 0.3095
Epoch 18/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 114470.6797 - loss: 0.1588 - mae: 0.2053 - val_dmae: 171991.2031 - val_loss: 0.2424 - val_mae: 0.3085
Epoch 19/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 33ms/step - dmae: 115152.3125 - loss: 0.1615 - mae: 0.2066 - val_dmae: 171650.1719 - val_loss: 0.2415 - val_mae: 0.3079
Epoch 20/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 33ms/step - dmae: 113753.4453 - loss: 0.1583 - mae: 0.2041 - val_dmae: 171029.7969 - val_loss: 0.2391 - val_mae: 0.3068
Epoch 21/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 33ms/step - dmae: 115437.0312 - loss: 0.1617 - mae: 0.2071 - val_dmae: 170597.5625 - val_loss: 0.2376 - val_mae: 0.3060
Epoch 22/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 2s 33ms/step - dmae: 113980.0000 - loss: 0.1560 - mae: 0.2045 - val_dmae: 170363.7656 - val_loss: 0.2370 - val_mae: 0.3056
Epoch 23/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 114094.0859 - loss: 0.1562 - mae: 0.2047 - val_dmae: 170165.3438 - val_loss: 0.2360 - val_mae: 0.3053
Epoch 24/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 114184.0391 - loss: 0.1563 - mae: 0.2048 - val_dmae: 169820.7188 - val_loss: 0.2348 - val_mae: 0.3046
Epoch 25/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 114781.8828 - loss: 0.1565 - mae: 0.2059 - val_dmae: 169405.0469 - val_loss: 0.2330 - val_mae: 0.3039
Epoch 26/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 113491.8672 - loss: 0.1549 - mae: 0.2036 - val_dmae: 169161.6719 - val_loss: 0.2319 - val_mae: 0.3035
Epoch 27/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 114589.5547 - loss: 0.1560 - mae: 0.2056 - val_dmae: 169091.2656 - val_loss: 0.2317 - val_mae: 0.3033
Epoch 28/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 33ms/step - dmae: 112725.0703 - loss: 0.1532 - mae: 0.2022 - val_dmae: 168850.4531 - val_loss: 0.2306 - val_mae: 0.3029
Epoch 29/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 114025.0391 - loss: 0.1562 - mae: 0.2045 - val_dmae: 168487.0469 - val_loss: 0.2289 - val_mae: 0.3022
Epoch 30/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 33ms/step - dmae: 112877.6719 - loss: 0.1548 - mae: 0.2025 - val_dmae: 168416.2656 - val_loss: 0.2285 - val_mae: 0.3021
Epoch 31/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 113796.4844 - loss: 0.1537 - mae: 0.2041 - val_dmae: 168349.5781 - val_loss: 0.2281 - val_mae: 0.3020
Epoch 32/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 113276.8984 - loss: 0.1520 - mae: 0.2032 - val_dmae: 168582.9531 - val_loss: 0.2292 - val_mae: 0.3024
Epoch 33/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 112414.7109 - loss: 0.1520 - mae: 0.2017 - val_dmae: 168078.9531 - val_loss: 0.2269 - val_mae: 0.3015
Epoch 34/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 113504.9297 - loss: 0.1542 - mae: 0.2036 - val_dmae: 167869.4375 - val_loss: 0.2260 - val_mae: 0.3011
Epoch 35/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 33ms/step - dmae: 114082.0156 - loss: 0.1527 - mae: 0.2047 - val_dmae: 167823.6250 - val_loss: 0.2258 - val_mae: 0.3011
Epoch 36/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 33ms/step - dmae: 114634.7500 - loss: 0.1520 - mae: 0.2056 - val_dmae: 167706.7500 - val_loss: 0.2251 - val_mae: 0.3008
Epoch 37/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 112912.6875 - loss: 0.1515 - mae: 0.2026 - val_dmae: 167606.3438 - val_loss: 0.2246 - val_mae: 0.3007
Epoch 38/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 33ms/step - dmae: 112682.2734 - loss: 0.1520 - mae: 0.2021 - val_dmae: 167534.5469 - val_loss: 0.2242 - val_mae: 0.3005
Epoch 39/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 33ms/step - dmae: 114028.4609 - loss: 0.1529 - mae: 0.2046 - val_dmae: 167333.6719 - val_loss: 0.2235 - val_mae: 0.3002
Epoch 40/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 33ms/step - dmae: 112882.1953 - loss: 0.1505 - mae: 0.2025 - val_dmae: 167228.1719 - val_loss: 0.2226 - val_mae: 0.3000
Epoch 41/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 113924.5625 - loss: 0.1508 - mae: 0.2044 - val_dmae: 167171.2812 - val_loss: 0.2222 - val_mae: 0.2999
Epoch 42/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 33ms/step - dmae: 113331.3047 - loss: 0.1509 - mae: 0.2033 - val_dmae: 167154.7031 - val_loss: 0.2220 - val_mae: 0.2999
Epoch 43/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 112643.8984 - loss: 0.1482 - mae: 0.2021 - val_dmae: 167133.7969 - val_loss: 0.2219 - val_mae: 0.2998
Epoch 44/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 33ms/step - dmae: 113981.6719 - loss: 0.1504 - mae: 0.2045 - val_dmae: 166658.8750 - val_loss: 0.2204 - val_mae: 0.2990
Epoch 45/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 2s 33ms/step - dmae: 112627.7266 - loss: 0.1495 - mae: 0.2020 - val_dmae: 166463.5625 - val_loss: 0.2195 - val_mae: 0.2986
Epoch 46/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 113775.5156 - loss: 0.1510 - mae: 0.2041 - val_dmae: 166371.7188 - val_loss: 0.2188 - val_mae: 0.2985
Epoch 47/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 113860.0391 - loss: 0.1498 - mae: 0.2043 - val_dmae: 166245.8594 - val_loss: 0.2180 - val_mae: 0.2982
Epoch 48/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 33ms/step - dmae: 112618.0938 - loss: 0.1468 - mae: 0.2020 - val_dmae: 165862.8750 - val_loss: 0.2167 - val_mae: 0.2975
Epoch 49/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 113784.8438 - loss: 0.1493 - mae: 0.2041 - val_dmae: 166319.9688 - val_loss: 0.2184 - val_mae: 0.2984
Epoch 50/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 113663.5781 - loss: 0.1489 - mae: 0.2039 - val_dmae: 166230.7500 - val_loss: 0.2177 - val_mae: 0.2982
Epoch 51/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 113636.0156 - loss: 0.1490 - mae: 0.2039 - val_dmae: 166033.4844 - val_loss: 0.2169 - val_mae: 0.2978
Epoch 52/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 112797.0938 - loss: 0.1476 - mae: 0.2023 - val_dmae: 165979.0781 - val_loss: 0.2164 - val_mae: 0.2977
Epoch 53/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 112685.0312 - loss: 0.1475 - mae: 0.2021 - val_dmae: 165870.1250 - val_loss: 0.2161 - val_mae: 0.2976
Epoch 53: early stopping
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 16ms/step - dmae: 109373.4453 - loss: 0.1458 - mae: 0.1962
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 15ms/step - dmae: 190476.1094 - loss: 0.2470 - mae: 0.3417
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 16ms/step - dmae: 67127.4766 - loss: 0.0512 - mae: 0.1204
Epoch 1/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 6s 48ms/step - dmae: 542885.8750 - loss: 1.3482 - mae: 0.9739 - val_dmae: 452001.5938 - val_loss: 0.9708 - val_mae: 0.8108
Epoch 2/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 538225.2500 - loss: 1.3233 - mae: 0.9655 - val_dmae: 447985.7500 - val_loss: 0.9504 - val_mae: 0.8036
Epoch 3/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 533006.0625 - loss: 1.2973 - mae: 0.9562 - val_dmae: 441164.9062 - val_loss: 0.9186 - val_mae: 0.7914
Epoch 4/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 33ms/step - dmae: 523366.1250 - loss: 1.2526 - mae: 0.9389 - val_dmae: 427614.5312 - val_loss: 0.8600 - val_mae: 0.7671
Epoch 5/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 503529.6562 - loss: 1.1662 - mae: 0.9033 - val_dmae: 399042.0000 - val_loss: 0.7472 - val_mae: 0.7158
Epoch 6/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 461037.6562 - loss: 0.9945 - mae: 0.8271 - val_dmae: 340372.9062 - val_loss: 0.5544 - val_mae: 0.6106
Epoch 7/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 376205.1250 - loss: 0.6994 - mae: 0.6749 - val_dmae: 257555.6094 - val_loss: 0.3470 - val_mae: 0.4620
Epoch 8/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 247772.5938 - loss: 0.3686 - mae: 0.4445 - val_dmae: 203696.2344 - val_loss: 0.2810 - val_mae: 0.3654
Epoch 9/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 146143.1250 - loss: 0.2138 - mae: 0.2622 - val_dmae: 189576.6406 - val_loss: 0.2784 - val_mae: 0.3401
Epoch 10/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 121219.1719 - loss: 0.1864 - mae: 0.2175 - val_dmae: 185585.1250 - val_loss: 0.2699 - val_mae: 0.3329
Epoch 11/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 118221.6094 - loss: 0.1824 - mae: 0.2121 - val_dmae: 184950.5000 - val_loss: 0.2674 - val_mae: 0.3318
Epoch 12/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 116267.0703 - loss: 0.1809 - mae: 0.2086 - val_dmae: 184574.0938 - val_loss: 0.2659 - val_mae: 0.3311
Epoch 13/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 116197.8516 - loss: 0.1791 - mae: 0.2084 - val_dmae: 183642.5469 - val_loss: 0.2623 - val_mae: 0.3294
Epoch 14/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 115603.2266 - loss: 0.1794 - mae: 0.2074 - val_dmae: 183710.5938 - val_loss: 0.2621 - val_mae: 0.3296
Epoch 15/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 115182.0391 - loss: 0.1766 - mae: 0.2066 - val_dmae: 183358.7656 - val_loss: 0.2608 - val_mae: 0.3289
Epoch 16/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 116147.2969 - loss: 0.1766 - mae: 0.2084 - val_dmae: 182747.1250 - val_loss: 0.2588 - val_mae: 0.3278
Epoch 17/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 114122.9141 - loss: 0.1738 - mae: 0.2047 - val_dmae: 182535.3594 - val_loss: 0.2576 - val_mae: 0.3274
Epoch 18/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 115725.6484 - loss: 0.1767 - mae: 0.2076 - val_dmae: 182212.6406 - val_loss: 0.2562 - val_mae: 0.3269
Epoch 19/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 2s 33ms/step - dmae: 114727.6328 - loss: 0.1737 - mae: 0.2058 - val_dmae: 181869.4375 - val_loss: 0.2547 - val_mae: 0.3263
Epoch 20/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 33ms/step - dmae: 115268.4609 - loss: 0.1757 - mae: 0.2068 - val_dmae: 181436.7344 - val_loss: 0.2526 - val_mae: 0.3255
Epoch 21/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 116048.9609 - loss: 0.1746 - mae: 0.2082 - val_dmae: 181065.0781 - val_loss: 0.2519 - val_mae: 0.3248
Epoch 22/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 115443.7734 - loss: 0.1721 - mae: 0.2071 - val_dmae: 180717.7031 - val_loss: 0.2503 - val_mae: 0.3242
Epoch 23/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 114931.9609 - loss: 0.1719 - mae: 0.2062 - val_dmae: 180457.0000 - val_loss: 0.2490 - val_mae: 0.3237
Epoch 24/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 114084.3984 - loss: 0.1706 - mae: 0.2047 - val_dmae: 180484.0938 - val_loss: 0.2490 - val_mae: 0.3238
Epoch 25/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 115568.7031 - loss: 0.1725 - mae: 0.2073 - val_dmae: 180107.5625 - val_loss: 0.2468 - val_mae: 0.3231
Epoch 26/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 33ms/step - dmae: 114938.2812 - loss: 0.1700 - mae: 0.2062 - val_dmae: 179746.3438 - val_loss: 0.2466 - val_mae: 0.3224
Epoch 27/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 113814.6484 - loss: 0.1687 - mae: 0.2042 - val_dmae: 179346.9062 - val_loss: 0.2444 - val_mae: 0.3217
Epoch 28/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 113970.5703 - loss: 0.1688 - mae: 0.2045 - val_dmae: 179194.1719 - val_loss: 0.2440 - val_mae: 0.3215
Epoch 29/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 33ms/step - dmae: 114810.4453 - loss: 0.1698 - mae: 0.2060 - val_dmae: 179089.5938 - val_loss: 0.2434 - val_mae: 0.3213
Epoch 30/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 113767.2656 - loss: 0.1672 - mae: 0.2041 - val_dmae: 178877.4844 - val_loss: 0.2420 - val_mae: 0.3209
Epoch 31/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 114477.7656 - loss: 0.1674 - mae: 0.2054 - val_dmae: 178418.7656 - val_loss: 0.2407 - val_mae: 0.3201
Epoch 32/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 113481.8750 - loss: 0.1662 - mae: 0.2036 - val_dmae: 178532.7812 - val_loss: 0.2415 - val_mae: 0.3203
Epoch 33/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 114179.4609 - loss: 0.1655 - mae: 0.2048 - val_dmae: 178016.9219 - val_loss: 0.2394 - val_mae: 0.3193
Epoch 34/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 113420.6406 - loss: 0.1674 - mae: 0.2035 - val_dmae: 178001.3906 - val_loss: 0.2389 - val_mae: 0.3193
Epoch 35/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 113883.0859 - loss: 0.1649 - mae: 0.2043 - val_dmae: 177642.5312 - val_loss: 0.2376 - val_mae: 0.3187
Epoch 36/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 114357.0156 - loss: 0.1658 - mae: 0.2051 - val_dmae: 177638.3438 - val_loss: 0.2374 - val_mae: 0.3187
Epoch 37/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 113707.1406 - loss: 0.1645 - mae: 0.2040 - val_dmae: 177141.9688 - val_loss: 0.2358 - val_mae: 0.3178
Epoch 38/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 114193.8672 - loss: 0.1647 - mae: 0.2049 - val_dmae: 176986.1094 - val_loss: 0.2351 - val_mae: 0.3175
Epoch 39/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 113468.7734 - loss: 0.1636 - mae: 0.2036 - val_dmae: 176987.1719 - val_loss: 0.2353 - val_mae: 0.3175
Epoch 40/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 33ms/step - dmae: 113509.4375 - loss: 0.1631 - mae: 0.2036 - val_dmae: 176546.4531 - val_loss: 0.2339 - val_mae: 0.3167
Epoch 41/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 113332.3047 - loss: 0.1614 - mae: 0.2033 - val_dmae: 176279.2188 - val_loss: 0.2329 - val_mae: 0.3162
Epoch 42/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 31ms/step - dmae: 114312.2969 - loss: 0.1629 - mae: 0.2051 - val_dmae: 176585.4062 - val_loss: 0.2336 - val_mae: 0.3168
Epoch 43/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 33ms/step - dmae: 113721.4375 - loss: 0.1635 - mae: 0.2040 - val_dmae: 175714.6094 - val_loss: 0.2312 - val_mae: 0.3152
Epoch 44/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 113881.9219 - loss: 0.1614 - mae: 0.2043 - val_dmae: 175766.3750 - val_loss: 0.2303 - val_mae: 0.3153
Epoch 45/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 113196.1406 - loss: 0.1609 - mae: 0.2031 - val_dmae: 175584.0781 - val_loss: 0.2304 - val_mae: 0.3150
Epoch 46/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 112604.2266 - loss: 0.1602 - mae: 0.2020 - val_dmae: 175290.9688 - val_loss: 0.2292 - val_mae: 0.3145
Epoch 47/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 112892.9766 - loss: 0.1601 - mae: 0.2025 - val_dmae: 175278.9844 - val_loss: 0.2290 - val_mae: 0.3144
Epoch 48/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 113597.7188 - loss: 0.1608 - mae: 0.2038 - val_dmae: 174942.3906 - val_loss: 0.2282 - val_mae: 0.3138
Epoch 49/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 33ms/step - dmae: 113157.9453 - loss: 0.1597 - mae: 0.2030 - val_dmae: 174906.5938 - val_loss: 0.2283 - val_mae: 0.3138
Epoch 50/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 113202.1406 - loss: 0.1603 - mae: 0.2031 - val_dmae: 174502.4375 - val_loss: 0.2268 - val_mae: 0.3130
Epoch 51/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 112746.9219 - loss: 0.1589 - mae: 0.2023 - val_dmae: 174505.8438 - val_loss: 0.2263 - val_mae: 0.3130
Epoch 52/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 114172.2266 - loss: 0.1601 - mae: 0.2048 - val_dmae: 174127.7500 - val_loss: 0.2258 - val_mae: 0.3124
Epoch 53/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 112831.9766 - loss: 0.1577 - mae: 0.2024 - val_dmae: 174099.3750 - val_loss: 0.2253 - val_mae: 0.3123
Epoch 54/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 113775.4531 - loss: 0.1595 - mae: 0.2041 - val_dmae: 173874.9375 - val_loss: 0.2246 - val_mae: 0.3119
Epoch 55/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 113231.6016 - loss: 0.1567 - mae: 0.2031 - val_dmae: 173853.2656 - val_loss: 0.2245 - val_mae: 0.3119
Epoch 56/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 113297.1875 - loss: 0.1568 - mae: 0.2032 - val_dmae: 173188.1250 - val_loss: 0.2224 - val_mae: 0.3107
Epoch 57/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 114520.8828 - loss: 0.1590 - mae: 0.2054 - val_dmae: 173510.7344 - val_loss: 0.2231 - val_mae: 0.3113
Epoch 58/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 113246.2344 - loss: 0.1551 - mae: 0.2032 - val_dmae: 173298.4688 - val_loss: 0.2230 - val_mae: 0.3109
Epoch 59/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 113344.5078 - loss: 0.1566 - mae: 0.2033 - val_dmae: 173130.9219 - val_loss: 0.2223 - val_mae: 0.3106
Epoch 60/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 112634.7109 - loss: 0.1561 - mae: 0.2021 - val_dmae: 172955.0000 - val_loss: 0.2217 - val_mae: 0.3103
Epoch 61/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 114302.8281 - loss: 0.1586 - mae: 0.2050 - val_dmae: 172550.3750 - val_loss: 0.2203 - val_mae: 0.3095
Epoch 62/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 113152.3828 - loss: 0.1559 - mae: 0.2030 - val_dmae: 172691.8594 - val_loss: 0.2212 - val_mae: 0.3098
Epoch 63/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 33ms/step - dmae: 113472.9453 - loss: 0.1560 - mae: 0.2036 - val_dmae: 172608.3594 - val_loss: 0.2208 - val_mae: 0.3096
Epoch 64/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 112696.6406 - loss: 0.1546 - mae: 0.2022 - val_dmae: 172125.3281 - val_loss: 0.2192 - val_mae: 0.3088
Epoch 65/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 112876.3359 - loss: 0.1552 - mae: 0.2025 - val_dmae: 172371.1406 - val_loss: 0.2200 - val_mae: 0.3092
Epoch 66/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 111703.9141 - loss: 0.1524 - mae: 0.2004 - val_dmae: 171963.5625 - val_loss: 0.2191 - val_mae: 0.3085
Epoch 67/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 112363.0547 - loss: 0.1536 - mae: 0.2016 - val_dmae: 171886.1406 - val_loss: 0.2190 - val_mae: 0.3083
Epoch 68/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 112355.9531 - loss: 0.1539 - mae: 0.2016 - val_dmae: 171554.4062 - val_loss: 0.2171 - val_mae: 0.3078
Epoch 69/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 33ms/step - dmae: 112820.3438 - loss: 0.1537 - mae: 0.2024 - val_dmae: 171484.4688 - val_loss: 0.2176 - val_mae: 0.3076
Epoch 70/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 112648.9688 - loss: 0.1523 - mae: 0.2021 - val_dmae: 171098.6250 - val_loss: 0.2160 - val_mae: 0.3069
Epoch 71/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 33ms/step - dmae: 111562.0859 - loss: 0.1519 - mae: 0.2001 - val_dmae: 171032.4531 - val_loss: 0.2162 - val_mae: 0.3068
Epoch 72/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 111745.5156 - loss: 0.1502 - mae: 0.2005 - val_dmae: 170872.5469 - val_loss: 0.2155 - val_mae: 0.3065
Epoch 73/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 33ms/step - dmae: 112469.1016 - loss: 0.1516 - mae: 0.2018 - val_dmae: 170707.5781 - val_loss: 0.2152 - val_mae: 0.3062
Epoch 74/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 112073.8125 - loss: 0.1523 - mae: 0.2010 - val_dmae: 170779.2344 - val_loss: 0.2156 - val_mae: 0.3064
Epoch 75/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 111442.1562 - loss: 0.1500 - mae: 0.1999 - val_dmae: 170311.8438 - val_loss: 0.2141 - val_mae: 0.3055
Epoch 76/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 112289.5703 - loss: 0.1500 - mae: 0.2014 - val_dmae: 170276.3750 - val_loss: 0.2139 - val_mae: 0.3055
Epoch 77/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 111965.1406 - loss: 0.1512 - mae: 0.2009 - val_dmae: 170089.2031 - val_loss: 0.2136 - val_mae: 0.3051
Epoch 78/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 112433.2344 - loss: 0.1492 - mae: 0.2017 - val_dmae: 169798.8438 - val_loss: 0.2124 - val_mae: 0.3046
Epoch 79/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 110583.9219 - loss: 0.1491 - mae: 0.1984 - val_dmae: 170001.8281 - val_loss: 0.2133 - val_mae: 0.3050
Epoch 80/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 112145.4219 - loss: 0.1507 - mae: 0.2012 - val_dmae: 169714.3906 - val_loss: 0.2126 - val_mae: 0.3044
Epoch 81/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 112070.4141 - loss: 0.1494 - mae: 0.2010 - val_dmae: 169390.2344 - val_loss: 0.2113 - val_mae: 0.3039
Epoch 82/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 111551.1328 - loss: 0.1477 - mae: 0.2001 - val_dmae: 169156.8125 - val_loss: 0.2110 - val_mae: 0.3034
Epoch 83/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 111506.8828 - loss: 0.1482 - mae: 0.2000 - val_dmae: 169065.0938 - val_loss: 0.2110 - val_mae: 0.3033
Epoch 84/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 33ms/step - dmae: 112052.8359 - loss: 0.1485 - mae: 0.2010 - val_dmae: 168809.9688 - val_loss: 0.2102 - val_mae: 0.3028
Epoch 85/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 111165.3672 - loss: 0.1465 - mae: 0.1994 - val_dmae: 168738.1875 - val_loss: 0.2106 - val_mae: 0.3027
Epoch 86/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 111549.8672 - loss: 0.1474 - mae: 0.2001 - val_dmae: 168561.5781 - val_loss: 0.2099 - val_mae: 0.3024
Epoch 87/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 110146.1719 - loss: 0.1449 - mae: 0.1976 - val_dmae: 168332.7812 - val_loss: 0.2096 - val_mae: 0.3020
Epoch 88/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 110569.7188 - loss: 0.1453 - mae: 0.1984 - val_dmae: 168340.5312 - val_loss: 0.2096 - val_mae: 0.3020
Epoch 89/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 111318.3359 - loss: 0.1464 - mae: 0.1997 - val_dmae: 168265.2031 - val_loss: 0.2096 - val_mae: 0.3018
Epoch 90/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 111418.6016 - loss: 0.1467 - mae: 0.1999 - val_dmae: 167888.0312 - val_loss: 0.2075 - val_mae: 0.3012
Epoch 91/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 111008.3594 - loss: 0.1456 - mae: 0.1991 - val_dmae: 167326.2656 - val_loss: 0.2074 - val_mae: 0.3002
Epoch 92/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 111454.3594 - loss: 0.1458 - mae: 0.1999 - val_dmae: 167539.7188 - val_loss: 0.2079 - val_mae: 0.3005
Epoch 93/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 111437.6016 - loss: 0.1470 - mae: 0.1999 - val_dmae: 167408.5469 - val_loss: 0.2071 - val_mae: 0.3003
Epoch 94/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 33ms/step - dmae: 111337.7266 - loss: 0.1450 - mae: 0.1997 - val_dmae: 167287.9531 - val_loss: 0.2067 - val_mae: 0.3001
Epoch 95/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 111009.7734 - loss: 0.1444 - mae: 0.1991 - val_dmae: 166764.6562 - val_loss: 0.2060 - val_mae: 0.2992
Epoch 96/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 2s 33ms/step - dmae: 112575.5938 - loss: 0.1462 - mae: 0.2019 - val_dmae: 166680.3281 - val_loss: 0.2061 - val_mae: 0.2990
Epoch 97/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 110468.9219 - loss: 0.1434 - mae: 0.1982 - val_dmae: 166697.9531 - val_loss: 0.2063 - val_mae: 0.2990
Epoch 98/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 33ms/step - dmae: 109908.2266 - loss: 0.1433 - mae: 0.1972 - val_dmae: 166212.7500 - val_loss: 0.2054 - val_mae: 0.2982
Epoch 99/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 111006.3438 - loss: 0.1441 - mae: 0.1991 - val_dmae: 166284.1406 - val_loss: 0.2048 - val_mae: 0.2983
Epoch 100/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 33ms/step - dmae: 111148.8516 - loss: 0.1446 - mae: 0.1994 - val_dmae: 166068.9531 - val_loss: 0.2048 - val_mae: 0.2979
Epoch 101/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 111114.7031 - loss: 0.1448 - mae: 0.1993 - val_dmae: 165887.8906 - val_loss: 0.2038 - val_mae: 0.2976
Epoch 102/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 33ms/step - dmae: 110200.5000 - loss: 0.1428 - mae: 0.1977 - val_dmae: 165821.2188 - val_loss: 0.2048 - val_mae: 0.2975
Epoch 103/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 109903.6094 - loss: 0.1421 - mae: 0.1972 - val_dmae: 165756.4062 - val_loss: 0.2041 - val_mae: 0.2973
Epoch 104/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 109893.7344 - loss: 0.1422 - mae: 0.1971 - val_dmae: 165389.0938 - val_loss: 0.2036 - val_mae: 0.2967
Epoch 105/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 110429.0469 - loss: 0.1418 - mae: 0.1981 - val_dmae: 165008.1562 - val_loss: 0.2026 - val_mae: 0.2960
Epoch 106/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 111308.4922 - loss: 0.1430 - mae: 0.1997 - val_dmae: 164972.7344 - val_loss: 0.2024 - val_mae: 0.2959
Epoch 107/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 109730.4531 - loss: 0.1407 - mae: 0.1968 - val_dmae: 164707.2344 - val_loss: 0.2027 - val_mae: 0.2955
Epoch 108/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 33ms/step - dmae: 109646.6172 - loss: 0.1403 - mae: 0.1967 - val_dmae: 164284.5625 - val_loss: 0.2013 - val_mae: 0.2947
Epoch 109/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 110816.6484 - loss: 0.1414 - mae: 0.1988 - val_dmae: 164516.9219 - val_loss: 0.2018 - val_mae: 0.2951
Epoch 110/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 31ms/step - dmae: 110476.6875 - loss: 0.1419 - mae: 0.1982 - val_dmae: 164568.8906 - val_loss: 0.2027 - val_mae: 0.2952
Epoch 111/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 31ms/step - dmae: 109142.9141 - loss: 0.1404 - mae: 0.1958 - val_dmae: 164372.5625 - val_loss: 0.2020 - val_mae: 0.2949
Epoch 112/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 110030.5234 - loss: 0.1418 - mae: 0.1974 - val_dmae: 164095.7031 - val_loss: 0.2019 - val_mae: 0.2944
Epoch 113/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 110019.6641 - loss: 0.1406 - mae: 0.1974 - val_dmae: 163739.1094 - val_loss: 0.2007 - val_mae: 0.2937
Epoch 114/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 110041.9922 - loss: 0.1397 - mae: 0.1974 - val_dmae: 163688.5000 - val_loss: 0.2006 - val_mae: 0.2936
Epoch 115/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 110769.0938 - loss: 0.1398 - mae: 0.1987 - val_dmae: 163619.8906 - val_loss: 0.2011 - val_mae: 0.2935
Epoch 116/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 110223.7031 - loss: 0.1413 - mae: 0.1977 - val_dmae: 163317.0469 - val_loss: 0.2003 - val_mae: 0.2930
Epoch 117/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 33ms/step - dmae: 109623.9219 - loss: 0.1396 - mae: 0.1967 - val_dmae: 162866.0156 - val_loss: 0.1992 - val_mae: 0.2922
Epoch 118/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 109856.1953 - loss: 0.1394 - mae: 0.1971 - val_dmae: 163101.1406 - val_loss: 0.2005 - val_mae: 0.2926
Epoch 119/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 109650.6641 - loss: 0.1386 - mae: 0.1967 - val_dmae: 162652.8281 - val_loss: 0.1992 - val_mae: 0.2918
Epoch 120/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 110608.4141 - loss: 0.1382 - mae: 0.1984 - val_dmae: 162358.2344 - val_loss: 0.1984 - val_mae: 0.2913
Epoch 121/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 110852.3281 - loss: 0.1396 - mae: 0.1989 - val_dmae: 162718.1562 - val_loss: 0.1997 - val_mae: 0.2919
Epoch 122/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 108651.3047 - loss: 0.1364 - mae: 0.1949 - val_dmae: 162184.3438 - val_loss: 0.1985 - val_mae: 0.2909
Epoch 123/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 33ms/step - dmae: 109277.0156 - loss: 0.1383 - mae: 0.1960 - val_dmae: 162023.1094 - val_loss: 0.1984 - val_mae: 0.2907
Epoch 124/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 109824.9609 - loss: 0.1393 - mae: 0.1970 - val_dmae: 161662.4219 - val_loss: 0.1975 - val_mae: 0.2900
Epoch 125/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 2s 33ms/step - dmae: 109876.3125 - loss: 0.1379 - mae: 0.1971 - val_dmae: 161445.0156 - val_loss: 0.1974 - val_mae: 0.2896
Epoch 126/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 109235.0469 - loss: 0.1382 - mae: 0.1960 - val_dmae: 161456.7812 - val_loss: 0.1977 - val_mae: 0.2896
Epoch 127/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 108460.0859 - loss: 0.1368 - mae: 0.1946 - val_dmae: 161381.3750 - val_loss: 0.1975 - val_mae: 0.2895
Epoch 128/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 108401.3516 - loss: 0.1365 - mae: 0.1945 - val_dmae: 161289.5469 - val_loss: 0.1976 - val_mae: 0.2893
Epoch 129/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 108617.3203 - loss: 0.1359 - mae: 0.1948 - val_dmae: 160974.2812 - val_loss: 0.1971 - val_mae: 0.2888
Epoch 130/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 109655.6875 - loss: 0.1370 - mae: 0.1967 - val_dmae: 161102.2188 - val_loss: 0.1975 - val_mae: 0.2890
Epoch 131/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 109398.1094 - loss: 0.1364 - mae: 0.1962 - val_dmae: 160521.3438 - val_loss: 0.1961 - val_mae: 0.2880
Epoch 132/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 109680.5938 - loss: 0.1369 - mae: 0.1968 - val_dmae: 160421.3125 - val_loss: 0.1961 - val_mae: 0.2878
Epoch 133/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 109436.8516 - loss: 0.1355 - mae: 0.1963 - val_dmae: 160297.8594 - val_loss: 0.1961 - val_mae: 0.2876
Epoch 134/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 109202.5625 - loss: 0.1355 - mae: 0.1959 - val_dmae: 160080.2344 - val_loss: 0.1950 - val_mae: 0.2872
Epoch 135/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 33ms/step - dmae: 108586.0859 - loss: 0.1342 - mae: 0.1948 - val_dmae: 159884.8125 - val_loss: 0.1944 - val_mae: 0.2868
Epoch 136/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 33ms/step - dmae: 108862.0469 - loss: 0.1343 - mae: 0.1953 - val_dmae: 159621.0156 - val_loss: 0.1947 - val_mae: 0.2863
Epoch 137/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 108813.6250 - loss: 0.1351 - mae: 0.1952 - val_dmae: 159446.7812 - val_loss: 0.1944 - val_mae: 0.2860
Epoch 138/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 108841.1406 - loss: 0.1348 - mae: 0.1952 - val_dmae: 159017.5312 - val_loss: 0.1937 - val_mae: 0.2853
Epoch 139/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 108340.0547 - loss: 0.1351 - mae: 0.1944 - val_dmae: 158870.3438 - val_loss: 0.1931 - val_mae: 0.2850
Epoch 140/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 108056.6094 - loss: 0.1336 - mae: 0.1938 - val_dmae: 158991.8906 - val_loss: 0.1933 - val_mae: 0.2852
Epoch 141/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 108719.6719 - loss: 0.1350 - mae: 0.1950 - val_dmae: 158521.5000 - val_loss: 0.1923 - val_mae: 0.2844
Epoch 142/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 107508.9688 - loss: 0.1342 - mae: 0.1929 - val_dmae: 158642.7344 - val_loss: 0.1930 - val_mae: 0.2846
Epoch 143/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 108340.4062 - loss: 0.1350 - mae: 0.1944 - val_dmae: 158773.4219 - val_loss: 0.1940 - val_mae: 0.2848
Epoch 144/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 107945.5703 - loss: 0.1320 - mae: 0.1936 - val_dmae: 158089.4219 - val_loss: 0.1918 - val_mae: 0.2836
Epoch 145/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 108849.5781 - loss: 0.1330 - mae: 0.1953 - val_dmae: 158349.7031 - val_loss: 0.1930 - val_mae: 0.2841
Epoch 146/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 108568.9453 - loss: 0.1342 - mae: 0.1948 - val_dmae: 157932.9844 - val_loss: 0.1917 - val_mae: 0.2833
Epoch 147/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 108152.3906 - loss: 0.1321 - mae: 0.1940 - val_dmae: 157788.4531 - val_loss: 0.1918 - val_mae: 0.2831
Epoch 148/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 108145.6562 - loss: 0.1332 - mae: 0.1940 - val_dmae: 157675.0156 - val_loss: 0.1916 - val_mae: 0.2829
Epoch 149/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 107755.1953 - loss: 0.1315 - mae: 0.1933 - val_dmae: 157719.0000 - val_loss: 0.1922 - val_mae: 0.2829
Epoch 150/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 108307.6484 - loss: 0.1335 - mae: 0.1943 - val_dmae: 157404.5469 - val_loss: 0.1919 - val_mae: 0.2824
Epoch 151/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 108318.4609 - loss: 0.1339 - mae: 0.1943 - val_dmae: 157079.5312 - val_loss: 0.1915 - val_mae: 0.2818
Epoch 152/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 107493.1641 - loss: 0.1311 - mae: 0.1928 - val_dmae: 157264.6094 - val_loss: 0.1918 - val_mae: 0.2821
Epoch 153/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 107576.0859 - loss: 0.1312 - mae: 0.1930 - val_dmae: 157086.2969 - val_loss: 0.1917 - val_mae: 0.2818
Epoch 154/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 107891.5234 - loss: 0.1327 - mae: 0.1935 - val_dmae: 156732.2812 - val_loss: 0.1909 - val_mae: 0.2812
Epoch 155/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 108039.8281 - loss: 0.1315 - mae: 0.1938 - val_dmae: 156483.6719 - val_loss: 0.1898 - val_mae: 0.2807
Epoch 156/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 33ms/step - dmae: 107596.4062 - loss: 0.1316 - mae: 0.1930 - val_dmae: 156406.0469 - val_loss: 0.1901 - val_mae: 0.2806
Epoch 157/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 108016.4375 - loss: 0.1306 - mae: 0.1938 - val_dmae: 156262.7188 - val_loss: 0.1897 - val_mae: 0.2803
Epoch 158/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 33ms/step - dmae: 107137.2031 - loss: 0.1301 - mae: 0.1922 - val_dmae: 155973.8125 - val_loss: 0.1896 - val_mae: 0.2798
Epoch 159/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 107873.5469 - loss: 0.1311 - mae: 0.1935 - val_dmae: 155879.9219 - val_loss: 0.1887 - val_mae: 0.2796
Epoch 160/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 108657.2031 - loss: 0.1322 - mae: 0.1949 - val_dmae: 155886.8750 - val_loss: 0.1887 - val_mae: 0.2796
Epoch 161/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 106294.1641 - loss: 0.1290 - mae: 0.1907 - val_dmae: 155701.0469 - val_loss: 0.1890 - val_mae: 0.2793
Epoch 162/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 107009.3359 - loss: 0.1302 - mae: 0.1920 - val_dmae: 155284.2344 - val_loss: 0.1880 - val_mae: 0.2786
Epoch 163/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 107031.7344 - loss: 0.1304 - mae: 0.1920 - val_dmae: 154977.6875 - val_loss: 0.1870 - val_mae: 0.2780
Epoch 164/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 107101.6719 - loss: 0.1301 - mae: 0.1921 - val_dmae: 155357.2812 - val_loss: 0.1886 - val_mae: 0.2787
Epoch 165/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 106978.8984 - loss: 0.1296 - mae: 0.1919 - val_dmae: 155292.6719 - val_loss: 0.1889 - val_mae: 0.2786
Epoch 166/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 106413.1484 - loss: 0.1288 - mae: 0.1909 - val_dmae: 154936.2031 - val_loss: 0.1879 - val_mae: 0.2779
Epoch 167/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 107115.4531 - loss: 0.1282 - mae: 0.1922 - val_dmae: 154901.1562 - val_loss: 0.1875 - val_mae: 0.2779
Epoch 168/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 108510.7031 - loss: 0.1314 - mae: 0.1947 - val_dmae: 154478.8906 - val_loss: 0.1868 - val_mae: 0.2771
Epoch 169/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 33ms/step - dmae: 107430.1406 - loss: 0.1302 - mae: 0.1927 - val_dmae: 154379.1094 - val_loss: 0.1863 - val_mae: 0.2769
Epoch 170/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 106800.5078 - loss: 0.1285 - mae: 0.1916 - val_dmae: 154272.1406 - val_loss: 0.1869 - val_mae: 0.2767
Epoch 171/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 106055.7656 - loss: 0.1284 - mae: 0.1903 - val_dmae: 154087.9844 - val_loss: 0.1863 - val_mae: 0.2764
Epoch 172/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 107727.5859 - loss: 0.1305 - mae: 0.1933 - val_dmae: 154042.3125 - val_loss: 0.1863 - val_mae: 0.2763
Epoch 173/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 107156.7266 - loss: 0.1291 - mae: 0.1922 - val_dmae: 153795.9531 - val_loss: 0.1856 - val_mae: 0.2759
Epoch 174/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 33ms/step - dmae: 107074.2656 - loss: 0.1284 - mae: 0.1921 - val_dmae: 153568.9688 - val_loss: 0.1853 - val_mae: 0.2755
Epoch 175/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 107450.9062 - loss: 0.1289 - mae: 0.1928 - val_dmae: 153427.0156 - val_loss: 0.1852 - val_mae: 0.2752
Epoch 176/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 106387.4297 - loss: 0.1267 - mae: 0.1908 - val_dmae: 153504.4062 - val_loss: 0.1858 - val_mae: 0.2754
Epoch 177/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 105184.5234 - loss: 0.1276 - mae: 0.1887 - val_dmae: 153414.1562 - val_loss: 0.1861 - val_mae: 0.2752
Epoch 178/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 33ms/step - dmae: 106221.9297 - loss: 0.1276 - mae: 0.1906 - val_dmae: 153026.5312 - val_loss: 0.1849 - val_mae: 0.2745
Epoch 179/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 106126.6172 - loss: 0.1262 - mae: 0.1904 - val_dmae: 152686.9688 - val_loss: 0.1841 - val_mae: 0.2739
Epoch 180/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 33ms/step - dmae: 106487.2500 - loss: 0.1290 - mae: 0.1910 - val_dmae: 152574.4219 - val_loss: 0.1838 - val_mae: 0.2737
Epoch 181/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 107159.1250 - loss: 0.1285 - mae: 0.1922 - val_dmae: 152574.5625 - val_loss: 0.1838 - val_mae: 0.2737
Epoch 182/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 105440.7891 - loss: 0.1259 - mae: 0.1891 - val_dmae: 152559.5000 - val_loss: 0.1842 - val_mae: 0.2737
Epoch 183/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 105807.7500 - loss: 0.1266 - mae: 0.1898 - val_dmae: 152236.9219 - val_loss: 0.1838 - val_mae: 0.2731
Epoch 184/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 31ms/step - dmae: 106526.8828 - loss: 0.1262 - mae: 0.1911 - val_dmae: 152305.2344 - val_loss: 0.1843 - val_mae: 0.2732
Epoch 185/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 31ms/step - dmae: 105957.4766 - loss: 0.1266 - mae: 0.1901 - val_dmae: 152092.9062 - val_loss: 0.1840 - val_mae: 0.2728
Epoch 186/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 106030.1250 - loss: 0.1270 - mae: 0.1902 - val_dmae: 151704.0312 - val_loss: 0.1827 - val_mae: 0.2721
Epoch 187/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 105955.5938 - loss: 0.1267 - mae: 0.1901 - val_dmae: 151623.6406 - val_loss: 0.1830 - val_mae: 0.2720
Epoch 188/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 104480.4609 - loss: 0.1249 - mae: 0.1874 - val_dmae: 151746.4844 - val_loss: 0.1827 - val_mae: 0.2722
Epoch 189/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 33ms/step - dmae: 106214.8906 - loss: 0.1266 - mae: 0.1905 - val_dmae: 151468.5469 - val_loss: 0.1830 - val_mae: 0.2717
Epoch 190/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 33ms/step - dmae: 106024.4609 - loss: 0.1258 - mae: 0.1902 - val_dmae: 151407.7656 - val_loss: 0.1820 - val_mae: 0.2716
Epoch 191/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 105541.5859 - loss: 0.1246 - mae: 0.1893 - val_dmae: 151270.4688 - val_loss: 0.1822 - val_mae: 0.2714
Epoch 192/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 104323.5156 - loss: 0.1247 - mae: 0.1871 - val_dmae: 151207.7656 - val_loss: 0.1824 - val_mae: 0.2713
Epoch 193/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 33ms/step - dmae: 106373.6719 - loss: 0.1246 - mae: 0.1908 - val_dmae: 150761.5156 - val_loss: 0.1814 - val_mae: 0.2705
Epoch 194/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 105235.9297 - loss: 0.1240 - mae: 0.1888 - val_dmae: 150923.3125 - val_loss: 0.1821 - val_mae: 0.2707
Epoch 195/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 104805.8125 - loss: 0.1248 - mae: 0.1880 - val_dmae: 150390.7188 - val_loss: 0.1804 - val_mae: 0.2698
Epoch 196/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 105523.6172 - loss: 0.1238 - mae: 0.1893 - val_dmae: 150527.9844 - val_loss: 0.1815 - val_mae: 0.2700
Epoch 197/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 105714.2734 - loss: 0.1259 - mae: 0.1896 - val_dmae: 150276.6094 - val_loss: 0.1806 - val_mae: 0.2696
Epoch 198/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 104803.9297 - loss: 0.1230 - mae: 0.1880 - val_dmae: 150406.7188 - val_loss: 0.1815 - val_mae: 0.2698
Epoch 199/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 104709.1172 - loss: 0.1235 - mae: 0.1878 - val_dmae: 150211.8906 - val_loss: 0.1810 - val_mae: 0.2695
Epoch 200/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 105053.7422 - loss: 0.1233 - mae: 0.1885 - val_dmae: 149657.1250 - val_loss: 0.1791 - val_mae: 0.2685
Epoch 201/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 104053.8047 - loss: 0.1229 - mae: 0.1867 - val_dmae: 149649.0312 - val_loss: 0.1798 - val_mae: 0.2685
Epoch 202/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 33ms/step - dmae: 105387.2500 - loss: 0.1221 - mae: 0.1891 - val_dmae: 149414.8438 - val_loss: 0.1790 - val_mae: 0.2680
Epoch 203/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 105187.0547 - loss: 0.1244 - mae: 0.1887 - val_dmae: 149460.5781 - val_loss: 0.1797 - val_mae: 0.2681
Epoch 204/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 33ms/step - dmae: 105261.3984 - loss: 0.1235 - mae: 0.1888 - val_dmae: 149303.2812 - val_loss: 0.1794 - val_mae: 0.2678
Epoch 205/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 33ms/step - dmae: 104740.0312 - loss: 0.1247 - mae: 0.1879 - val_dmae: 149366.5469 - val_loss: 0.1793 - val_mae: 0.2679
Epoch 206/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 105798.8672 - loss: 0.1238 - mae: 0.1898 - val_dmae: 149121.7656 - val_loss: 0.1793 - val_mae: 0.2675
Epoch 207/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 105465.5938 - loss: 0.1247 - mae: 0.1892 - val_dmae: 149005.5625 - val_loss: 0.1787 - val_mae: 0.2673
Epoch 208/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 103861.7109 - loss: 0.1224 - mae: 0.1863 - val_dmae: 148555.6094 - val_loss: 0.1771 - val_mae: 0.2665
Epoch 209/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 105528.9844 - loss: 0.1246 - mae: 0.1893 - val_dmae: 148496.3750 - val_loss: 0.1770 - val_mae: 0.2664
Epoch 210/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 103595.3359 - loss: 0.1220 - mae: 0.1858 - val_dmae: 148510.2500 - val_loss: 0.1784 - val_mae: 0.2664
Epoch 211/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 31ms/step - dmae: 104518.0156 - loss: 0.1225 - mae: 0.1875 - val_dmae: 148526.7500 - val_loss: 0.1777 - val_mae: 0.2664
Epoch 212/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 105067.3672 - loss: 0.1221 - mae: 0.1885 - val_dmae: 148147.7188 - val_loss: 0.1772 - val_mae: 0.2658
Epoch 213/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 31ms/step - dmae: 104695.2656 - loss: 0.1223 - mae: 0.1878 - val_dmae: 148290.7969 - val_loss: 0.1778 - val_mae: 0.2660
Epoch 214/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 104618.8984 - loss: 0.1223 - mae: 0.1877 - val_dmae: 148022.7344 - val_loss: 0.1775 - val_mae: 0.2655
Epoch 215/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 104250.8438 - loss: 0.1230 - mae: 0.1870 - val_dmae: 147912.5469 - val_loss: 0.1773 - val_mae: 0.2653
Epoch 216/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 103550.7031 - loss: 0.1195 - mae: 0.1858 - val_dmae: 147608.1406 - val_loss: 0.1761 - val_mae: 0.2648
Epoch 217/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 103646.9141 - loss: 0.1209 - mae: 0.1859 - val_dmae: 147640.2969 - val_loss: 0.1767 - val_mae: 0.2649
Epoch 218/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 105241.2656 - loss: 0.1225 - mae: 0.1888 - val_dmae: 147483.8594 - val_loss: 0.1761 - val_mae: 0.2646
Epoch 219/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 104939.9609 - loss: 0.1230 - mae: 0.1883 - val_dmae: 147576.8594 - val_loss: 0.1769 - val_mae: 0.2647
Epoch 220/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 103845.0391 - loss: 0.1202 - mae: 0.1863 - val_dmae: 147448.6875 - val_loss: 0.1763 - val_mae: 0.2645
Epoch 221/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 103532.2109 - loss: 0.1202 - mae: 0.1857 - val_dmae: 147212.7812 - val_loss: 0.1755 - val_mae: 0.2641
Epoch 222/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 104195.4688 - loss: 0.1203 - mae: 0.1869 - val_dmae: 147055.0625 - val_loss: 0.1759 - val_mae: 0.2638
Epoch 223/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 104057.0000 - loss: 0.1205 - mae: 0.1867 - val_dmae: 147257.3125 - val_loss: 0.1766 - val_mae: 0.2642
Epoch 224/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 103649.0156 - loss: 0.1197 - mae: 0.1859 - val_dmae: 146918.3281 - val_loss: 0.1752 - val_mae: 0.2636
Epoch 225/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 104231.2109 - loss: 0.1211 - mae: 0.1870 - val_dmae: 146647.1719 - val_loss: 0.1741 - val_mae: 0.2631
Epoch 226/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 33ms/step - dmae: 103827.5781 - loss: 0.1216 - mae: 0.1863 - val_dmae: 146601.3281 - val_loss: 0.1752 - val_mae: 0.2630
Epoch 227/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 33ms/step - dmae: 103904.1094 - loss: 0.1211 - mae: 0.1864 - val_dmae: 146471.7188 - val_loss: 0.1744 - val_mae: 0.2628
Epoch 228/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 103020.2578 - loss: 0.1208 - mae: 0.1848 - val_dmae: 146603.7812 - val_loss: 0.1750 - val_mae: 0.2630
Epoch 229/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 103537.2578 - loss: 0.1208 - mae: 0.1857 - val_dmae: 146576.8594 - val_loss: 0.1742 - val_mae: 0.2629
Epoch 230/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 33ms/step - dmae: 104040.5312 - loss: 0.1201 - mae: 0.1866 - val_dmae: 146457.2188 - val_loss: 0.1741 - val_mae: 0.2627
Epoch 231/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 33ms/step - dmae: 104493.2969 - loss: 0.1215 - mae: 0.1874 - val_dmae: 146353.3281 - val_loss: 0.1740 - val_mae: 0.2625
Epoch 232/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 102973.3359 - loss: 0.1188 - mae: 0.1847 - val_dmae: 146157.4219 - val_loss: 0.1746 - val_mae: 0.2622
Epoch 233/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 102695.7266 - loss: 0.1186 - mae: 0.1842 - val_dmae: 146187.2812 - val_loss: 0.1737 - val_mae: 0.2622
Epoch 234/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 33ms/step - dmae: 103857.3438 - loss: 0.1198 - mae: 0.1863 - val_dmae: 146064.4219 - val_loss: 0.1738 - val_mae: 0.2620
Epoch 235/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 33ms/step - dmae: 103117.6484 - loss: 0.1209 - mae: 0.1850 - val_dmae: 145737.9219 - val_loss: 0.1731 - val_mae: 0.2614
Epoch 236/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 33ms/step - dmae: 104014.7656 - loss: 0.1206 - mae: 0.1866 - val_dmae: 145696.0625 - val_loss: 0.1728 - val_mae: 0.2614
Epoch 237/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 103221.2188 - loss: 0.1190 - mae: 0.1852 - val_dmae: 145680.2031 - val_loss: 0.1728 - val_mae: 0.2613
Epoch 238/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 104014.7656 - loss: 0.1215 - mae: 0.1866 - val_dmae: 145807.4688 - val_loss: 0.1736 - val_mae: 0.2616
Epoch 239/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 33ms/step - dmae: 104184.9375 - loss: 0.1195 - mae: 0.1869 - val_dmae: 145496.9219 - val_loss: 0.1727 - val_mae: 0.2610
Epoch 240/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 103694.4688 - loss: 0.1193 - mae: 0.1860 - val_dmae: 145300.8594 - val_loss: 0.1725 - val_mae: 0.2607
Epoch 241/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 102590.7578 - loss: 0.1189 - mae: 0.1840 - val_dmae: 145547.6719 - val_loss: 0.1724 - val_mae: 0.2611
Epoch 242/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 102978.3203 - loss: 0.1195 - mae: 0.1847 - val_dmae: 145220.7656 - val_loss: 0.1724 - val_mae: 0.2605
Epoch 243/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 103583.5938 - loss: 0.1208 - mae: 0.1858 - val_dmae: 145261.2969 - val_loss: 0.1717 - val_mae: 0.2606
Epoch 244/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 103518.3359 - loss: 0.1187 - mae: 0.1857 - val_dmae: 145046.7188 - val_loss: 0.1719 - val_mae: 0.2602
Epoch 245/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 102786.3984 - loss: 0.1184 - mae: 0.1844 - val_dmae: 145312.6406 - val_loss: 0.1722 - val_mae: 0.2607
Epoch 246/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 102611.7734 - loss: 0.1185 - mae: 0.1841 - val_dmae: 145306.1719 - val_loss: 0.1723 - val_mae: 0.2607
Epoch 247/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 102552.2109 - loss: 0.1172 - mae: 0.1840 - val_dmae: 144908.2188 - val_loss: 0.1716 - val_mae: 0.2599
Epoch 248/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 31ms/step - dmae: 103532.7188 - loss: 0.1191 - mae: 0.1857 - val_dmae: 144799.7812 - val_loss: 0.1708 - val_mae: 0.2598
Epoch 249/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 31ms/step - dmae: 104344.7578 - loss: 0.1198 - mae: 0.1872 - val_dmae: 144842.2969 - val_loss: 0.1709 - val_mae: 0.2598
Epoch 250/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 101612.7188 - loss: 0.1174 - mae: 0.1823 - val_dmae: 144824.4531 - val_loss: 0.1704 - val_mae: 0.2598
Epoch 251/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 102107.8828 - loss: 0.1170 - mae: 0.1832 - val_dmae: 144831.1094 - val_loss: 0.1707 - val_mae: 0.2598
Epoch 252/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 102120.5078 - loss: 0.1164 - mae: 0.1832 - val_dmae: 144638.8125 - val_loss: 0.1704 - val_mae: 0.2595
Epoch 253/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 103386.5859 - loss: 0.1174 - mae: 0.1855 - val_dmae: 144971.1406 - val_loss: 0.1716 - val_mae: 0.2601
Epoch 254/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 31ms/step - dmae: 103261.4297 - loss: 0.1195 - mae: 0.1852 - val_dmae: 144808.3906 - val_loss: 0.1707 - val_mae: 0.2598
Epoch 255/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 103291.4688 - loss: 0.1181 - mae: 0.1853 - val_dmae: 144538.8594 - val_loss: 0.1698 - val_mae: 0.2593
Epoch 256/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 103484.4844 - loss: 0.1185 - mae: 0.1856 - val_dmae: 144443.0625 - val_loss: 0.1701 - val_mae: 0.2591
Epoch 257/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 102667.3359 - loss: 0.1172 - mae: 0.1842 - val_dmae: 144333.7812 - val_loss: 0.1691 - val_mae: 0.2589
Epoch 258/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 31ms/step - dmae: 102771.0234 - loss: 0.1172 - mae: 0.1844 - val_dmae: 144702.7031 - val_loss: 0.1703 - val_mae: 0.2596
Epoch 259/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 33ms/step - dmae: 101648.8047 - loss: 0.1157 - mae: 0.1823 - val_dmae: 144211.5781 - val_loss: 0.1696 - val_mae: 0.2587
Epoch 260/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 103089.7422 - loss: 0.1181 - mae: 0.1849 - val_dmae: 144391.2656 - val_loss: 0.1695 - val_mae: 0.2590
Epoch 261/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 33ms/step - dmae: 102337.8594 - loss: 0.1164 - mae: 0.1836 - val_dmae: 144162.4375 - val_loss: 0.1689 - val_mae: 0.2586
Epoch 262/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 102832.7500 - loss: 0.1153 - mae: 0.1845 - val_dmae: 144459.8281 - val_loss: 0.1694 - val_mae: 0.2591
Epoch 263/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 102736.5156 - loss: 0.1167 - mae: 0.1843 - val_dmae: 144180.4844 - val_loss: 0.1689 - val_mae: 0.2586
Epoch 264/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 102649.2578 - loss: 0.1180 - mae: 0.1841 - val_dmae: 144091.4375 - val_loss: 0.1687 - val_mae: 0.2585
Epoch 265/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 101288.4219 - loss: 0.1145 - mae: 0.1817 - val_dmae: 144019.6250 - val_loss: 0.1686 - val_mae: 0.2584
Epoch 266/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 103338.3906 - loss: 0.1159 - mae: 0.1854 - val_dmae: 144141.5469 - val_loss: 0.1687 - val_mae: 0.2586
Epoch 267/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 102956.6641 - loss: 0.1152 - mae: 0.1847 - val_dmae: 144166.7969 - val_loss: 0.1686 - val_mae: 0.2586
Epoch 268/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 101090.4141 - loss: 0.1143 - mae: 0.1813 - val_dmae: 144004.5000 - val_loss: 0.1684 - val_mae: 0.2583
Epoch 269/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 101609.6953 - loss: 0.1170 - mae: 0.1823 - val_dmae: 144194.0000 - val_loss: 0.1687 - val_mae: 0.2587
Epoch 270/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 102124.1953 - loss: 0.1144 - mae: 0.1832 - val_dmae: 144196.0938 - val_loss: 0.1688 - val_mae: 0.2587
Epoch 271/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 102352.0312 - loss: 0.1164 - mae: 0.1836 - val_dmae: 143632.0938 - val_loss: 0.1677 - val_mae: 0.2577
Epoch 272/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 31ms/step - dmae: 101420.8750 - loss: 0.1143 - mae: 0.1819 - val_dmae: 143945.0156 - val_loss: 0.1685 - val_mae: 0.2582
Epoch 273/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 102747.9062 - loss: 0.1168 - mae: 0.1843 - val_dmae: 143590.8125 - val_loss: 0.1670 - val_mae: 0.2576
Epoch 274/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 102121.2812 - loss: 0.1153 - mae: 0.1832 - val_dmae: 143827.0625 - val_loss: 0.1672 - val_mae: 0.2580
Epoch 275/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 102270.9219 - loss: 0.1160 - mae: 0.1835 - val_dmae: 143992.2656 - val_loss: 0.1684 - val_mae: 0.2583
Epoch 276/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 102169.9062 - loss: 0.1147 - mae: 0.1833 - val_dmae: 143778.7344 - val_loss: 0.1676 - val_mae: 0.2579
Epoch 277/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 102397.2969 - loss: 0.1168 - mae: 0.1837 - val_dmae: 143679.9531 - val_loss: 0.1671 - val_mae: 0.2577
Epoch 278/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 102442.7109 - loss: 0.1155 - mae: 0.1838 - val_dmae: 143576.2969 - val_loss: 0.1666 - val_mae: 0.2576
Epoch 279/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 101838.4062 - loss: 0.1148 - mae: 0.1827 - val_dmae: 143576.9844 - val_loss: 0.1664 - val_mae: 0.2576
Epoch 280/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 33ms/step - dmae: 102502.2969 - loss: 0.1151 - mae: 0.1839 - val_dmae: 143352.0312 - val_loss: 0.1662 - val_mae: 0.2572
Epoch 281/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 102242.9141 - loss: 0.1156 - mae: 0.1834 - val_dmae: 143803.0000 - val_loss: 0.1674 - val_mae: 0.2580
Epoch 282/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 101008.7344 - loss: 0.1131 - mae: 0.1812 - val_dmae: 143481.4219 - val_loss: 0.1664 - val_mae: 0.2574
Epoch 283/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 102167.6562 - loss: 0.1148 - mae: 0.1833 - val_dmae: 143665.0312 - val_loss: 0.1669 - val_mae: 0.2577
Epoch 284/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 102118.6406 - loss: 0.1138 - mae: 0.1832 - val_dmae: 143381.9375 - val_loss: 0.1658 - val_mae: 0.2572
Epoch 285/2000
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 32ms/step - dmae: 101206.8281 - loss: 0.1131 - mae: 0.1816 - val_dmae: 143790.5781 - val_loss: 0.1668 - val_mae: 0.2579
Epoch 285: early stopping
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 16ms/step - dmae: 98235.1250 - loss: 0.1126 - mae: 0.1762
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 15ms/step - dmae: 167197.1406 - loss: 0.1959 - mae: 0.2999
45/45 ━━━━━━━━━━━━━━━━━━━━ 1s 16ms/step - dmae: 64439.6758 - loss: 0.0429 - mae: 0.1156
                 train            val          test
S h  d                                             
2 10 0.1  92564.273438  168139.828125  61744.265625
3 20 0.1  97539.421875     165862.875  62486.363281
4 30 0.2  87536.265625   143352.03125  59724.261719

The best model performing model uses $S=4$ observations to predict the target sales. Although it is the largest architecture (in terms of number of parameters), it is able to generalize and maintain a good performance in unseen samples through regularization techniques. The next figure displays the errors between these three configuration.

In [58]:
plot_errors(models, [train, val, test])
2024-04-11 14:40:36.183662: W tensorflow/core/framework/local_rendezvous.cc:404] Local rendezvous is aborting with status: OUT_OF_RANGE: End of sequence
2024-04-11 14:40:41.622256: W tensorflow/core/framework/local_rendezvous.cc:404] Local rendezvous is aborting with status: OUT_OF_RANGE: End of sequence
2024-04-11 14:40:46.900794: W tensorflow/core/framework/local_rendezvous.cc:404] Local rendezvous is aborting with status: OUT_OF_RANGE: End of sequence
2024-04-11 14:40:54.065497: W tensorflow/core/framework/local_rendezvous.cc:404] Local rendezvous is aborting with status: OUT_OF_RANGE: End of sequence
2024-04-11 14:40:59.688378: W tensorflow/core/framework/local_rendezvous.cc:404] Local rendezvous is aborting with status: OUT_OF_RANGE: End of sequence
2024-04-11 14:41:05.015327: W tensorflow/core/framework/local_rendezvous.cc:404] Local rendezvous is aborting with status: OUT_OF_RANGE: End of sequence
2024-04-11 14:41:10.169108: W tensorflow/core/framework/local_rendezvous.cc:404] Local rendezvous is aborting with status: OUT_OF_RANGE: End of sequence
2024-04-11 14:41:16.689938: W tensorflow/core/framework/local_rendezvous.cc:404] Local rendezvous is aborting with status: OUT_OF_RANGE: End of sequence
2024-04-11 14:41:21.993791: W tensorflow/core/framework/local_rendezvous.cc:404] Local rendezvous is aborting with status: OUT_OF_RANGE: End of sequence